diff doc/v2_planning/dataset.txt @ 1089:f15216356522
Did the dataset committee decide to include some GPU support (using shared variables), at least in some cases?
author:   Razvan Pascanu <r.pascanu@gmail.com>
date:     Sat, 11 Sep 2010 21:42:38 -0400
parents:  65ac0f493830
children: 319de699fb67
--- a/doc/v2_planning/dataset.txt	Sat Sep 11 21:07:06 2010 -0400
+++ b/doc/v2_planning/dataset.txt	Sat Sep 11 21:42:38 2010 -0400
@@ -300,3 +300,18 @@
 worth at least keeping in mind this close relationship between simple
 processing and learning, and thinking about what are the benefits /
 drawbacks in keeping them separate in the class hierarchy.
+
+RP: I actually like this idea of having the dataset implement the same
+interface as the learner (or, more precisely, a subset of that interface).
+I hope people decide to do this.
+
+
+
+RP asks: What is the status of having the dataset support copying data
+to the GPU (by storing data in shared variables)? Have you decided to
+include this feature or not? I think that the strongest selling point of
+Theano is that it runs on the GPU transparently, and I see this as a good
+selling point for the library as well. Plus, we intend to move more and
+more towards running things on the GPU. If the dataset object does not
+support this feature, we will need to find hacks around it.
+
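Editor's note: the shared-variable idea RP asks about can be sketched roughly as follows. This is a hypothetical illustration, not the committee's decided API: the `MinibatchDataset` class and its method names are invented here, and a plain NumPy array stands in for the `theano.shared` variable that a real GPU-backed implementation would use.

```python
import numpy as np

class MinibatchDataset:
    """Hypothetical dataset that keeps all its data in one contiguous array.

    In the GPU-backed variant discussed above, `self.data` would be a
    `theano.shared` variable rather than a NumPy array, so minibatch
    slices could be taken on the GPU (e.g. via `givens` in a Theano
    function) instead of copying each batch from host to device.
    """

    def __init__(self, data, batch_size):
        self.data = np.asarray(data)   # would be theano.shared(data) on GPU
        self.batch_size = batch_size

    def n_batches(self):
        # Number of full minibatches available (remainder dropped).
        return self.data.shape[0] // self.batch_size

    def minibatch(self, index):
        # With a shared variable, this slice would be a symbolic
        # expression handed to the learner, not an eager host-side copy.
        start = index * self.batch_size
        return self.data[start:start + self.batch_size]

# Usage: iterate over minibatches the way a learner would consume them.
dataset = MinibatchDataset(np.arange(20).reshape(10, 2), batch_size=5)
batches = [dataset.minibatch(i) for i in range(dataset.n_batches())]
```

The design point RP raises is exactly the one this sketch hides: if the data lives in a shared variable, the slicing above happens on the device and no per-batch transfer is needed; if not, every minibatch incurs a host-to-GPU copy that callers must work around.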