# HG changeset patch
# User Razvan Pascanu
# Date 1284255758 14400
# Node ID f1521635652237e6a9660e6f435dcd3ba1153ba9
# Parent  e254065e7fd725c967de121a75cedab9e700c5a0
Did the dataset committee decide to include some GPU support (use shared variables) at least in some cases?

diff -r e254065e7fd7 -r f15216356522 doc/v2_planning/dataset.txt
--- a/doc/v2_planning/dataset.txt	Sat Sep 11 21:07:06 2010 -0400
+++ b/doc/v2_planning/dataset.txt	Sat Sep 11 21:42:38 2010 -0400
@@ -300,3 +300,18 @@
 worth at least keeping in mind this close relationship between simple
 processing and learning, and thinking about what are the benefits /
 drawbacks in keeping them separate in the class hierarchy.
+
+RP: I actually like this idea of having the dataset implement the same
+interface as the learner (or actually a subset of the interface ..).
+I hope people decide to do this.
+
+
+
+RP asks: What is the status of having the dataset support copying data
+onto the GPU (by storing data in shared variables)? Have you decided to
+include this feature or not? I think that the strongest selling point of
+Theano is that it runs on the GPU transparently, and I see this as a good
+selling point for the library as well. Plus, we intend to move more and
+more towards running things on the GPU. If the dataset object does not
+support this feature we will need to find hacks around it ..
+