pylearn changeset 1089:f15216356522
Did the dataset committee decide to include some GPU support (use shared variables), at least in some cases?
author   | Razvan Pascanu <r.pascanu@gmail.com>
date     | Sat, 11 Sep 2010 21:42:38 -0400
parents  | e254065e7fd7
children | a80b296eb0df
files    | doc/v2_planning/dataset.txt
diffstat | 1 file changed, 15 insertions(+), 0 deletions(-)
--- a/doc/v2_planning/dataset.txt	Sat Sep 11 21:07:06 2010 -0400
+++ b/doc/v2_planning/dataset.txt	Sat Sep 11 21:42:38 2010 -0400
@@ -300,3 +300,18 @@
 worth at least keeping in mind this close relationship between simple
 processing and learning, and thinking about what are the benefits /
 drawbacks in keeping them separate in the class hierarchy.
+
+RP: I actually like this idea of having the dataset implement the same
+interface as the learner (or, more precisely, a subset of that interface).
+I hope people decide to do this.
+
+
+
+RP asks: What is the status of having the dataset support copying data
+to the GPU (by storing data in shared variables)? Have you decided to
+include this feature or not? I think that the strongest selling point of
+Theano is that it runs on the GPU transparently, and I see this as a good
+selling point for the library as well. Plus, we intend to move more and
+more towards running things on the GPU. If the dataset object does not
+support this feature, we will need to find hacks around it.
+
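
For context, a minimal sketch of what "storing data in shared variables" could
look like, assuming a plain NumPy design matrix; this is illustrative only and
not pylearn's actual dataset API:

    import numpy
    import theano
    import theano.tensor as T

    batch_size = 100

    # Toy design matrix standing in for a dataset (cast to floatX so it can
    # live in GPU memory when device=gpu).
    data = numpy.random.randn(1000, 784).astype(theano.config.floatX)

    # Storing the whole dataset in a shared variable lets Theano keep it on
    # the device, so minibatch access needs no host-to-device copy per call.
    shared_data = theano.shared(data, name='dataset', borrow=True)

    index = T.lscalar('index')   # minibatch index
    x = T.matrix('x')

    # Compile a function that slices one minibatch directly out of the shared
    # dataset via `givens` (the sum is just a placeholder for real training
    # computation).
    f = theano.function(
        [index],
        x.sum(),
        givens={x: shared_data[index * batch_size:(index + 1) * batch_size]},
    )

    print(f(0))

The point of the `givens` slicing is that each call picks out the minibatch on
the device itself, which is why dataset-level support for shared variables
matters for transparent GPU use.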