pylearn: comparison of doc/v2_planning/dataset.txt @ 1089:f15216356522
Did the dataset committee decide to include some GPU support (using shared variables), at least in some cases?
author:   Razvan Pascanu <r.pascanu@gmail.com>
date:     Sat, 11 Sep 2010 21:42:38 -0400
parents:  65ac0f493830
children: 319de699fb67
comparing revisions 1088:e254065e7fd7 → 1089:f15216356522
I am not saying that we should necessarily do it this way, but I think it is
worth at least keeping in mind this close relationship between simple
processing and learning, and thinking about what the benefits / drawbacks
are of keeping them separate in the class hierarchy.

RP: I actually like this idea of having the dataset implement the same
interface as the learner (or, more precisely, a subset of that interface).
I hope people decide to do this.
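A minimal sketch of this shared-interface idea (all class and method names below are hypothetical illustrations, not part of the pylearn API): a processing "learner" exposes the same iteration subset of the interface as a raw dataset, so downstream code can consume either one interchangeably.

```python
class Dataset:
    """Hypothetical minimal dataset: just a stream of examples."""
    def __init__(self, rows):
        self.rows = rows

    def __iter__(self):
        return iter(self.rows)


class DoubleLearner:
    """Hypothetical processing 'learner' whose output stream is itself
    iterable, i.e. it implements the same subset of the interface as
    Dataset and can be dropped in wherever a dataset is expected."""
    def __init__(self, source):
        self.source = source

    def __iter__(self):
        # Stand-in for a learned / fitted transformation.
        return (tuple(2 * v for v in row) for row in self.source)


raw = Dataset([(1, 2), (3, 4)])
processed = DoubleLearner(raw)

# Downstream code cannot tell the two apart:
print(list(raw))        # [(1, 2), (3, 4)]
print(list(processed))  # [(2, 4), (6, 8)]
```

The point of the sketch is only that the class hierarchy question may matter less than the interface question: as long as both objects satisfy the same example-producing subset, chaining processing stages and datasets stays uniform.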
RP asks: What is the status of having the dataset support copying data
to the GPU (by storing data in shared variables)? Have you decided to
include this feature or not? I think that the strongest selling point of
Theano is that it runs on the GPU transparently, and I see this as a good
selling point for this library as well. Plus, we intend to move more and
more towards running things on the GPU. If the dataset object does not
support this feature we will need to find hacks around it.