Mercurial > pylearn
comparison doc/v2_planning/dataset.txt @ 1116:18a092001752
An idea about Datasets and GPU.
author:   Arnaud Bergeron <abergeron@gmail.com>
date:     Tue, 14 Sep 2010 14:20:31 -0400
parents:  4797a4cb73e1
children: c1943feada10
comparing revisions 1115:967975f9c574 and 1116:18a092001752
in the shared variable without recompiling the Theano function. The second case is when
the dataset is in an ordinary Theano variable. In that case, the first step in the
Theano function will be to transfer the dataset to the GPU before computation. If the data
change at each call, that will be as efficient as changing the data manually every time
in the shared variable.

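As a rough illustration of the two cases, here is a plain-Python/NumPy stand-in (not actual Theano code; `gpu_buffer` and the function names are hypothetical, with `gpu_buffer` playing the role of a shared variable's GPU-side storage):

```python
import numpy as np

# Case 1: shared-variable style. The storage lives with the "compiled"
# function; new data is written into it in place, so the function does
# not need to be rebuilt (in Theano: shared.set_value(...)).
gpu_buffer = np.zeros(4, dtype='float32')  # stand-in for GPU-side storage

def compiled_fn():
    return float(gpu_buffer.sum())

gpu_buffer[:] = [1, 2, 3, 4]   # update the data without "recompiling"
print(compiled_fn())           # 10.0

# Case 2: ordinary-variable style. The data is an explicit input, so in
# Theano every call would imply a host-to-GPU transfer of the dataset.
def compiled_fn2(data):
    return float(data.sum())

print(compiled_fn2(np.asarray([1, 2, 3, 4], dtype='float32')))  # 10.0
```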
AB: I have an idea about this which kind of fits in the "building a
theano op" thing that we talked about at the last meeting.

We could have a specialized Theano op that takes a dataset and returns
chunks of it by index, using the standard Dataset interface. The
code to transfer to the GPU (or whatever) goes in that op, so we don't
need to change the dataset interface.
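A minimal CPU-only sketch of that idea (the class and parameter names are hypothetical; in the real op, the GPU transfer would happen inside the op's implementation rather than in the dataset):

```python
class DatasetToMinibatch:
    """Stand-in for the proposed Theano op: wraps an object exposing the
    standard Dataset interface (here, anything indexable by slice) and,
    given a minibatch index, returns the corresponding chunk. Any GPU
    transfer would be done inside this op, so the Dataset interface
    itself stays unchanged."""

    def __init__(self, dataset, chunk_size):
        self.dataset = dataset
        self.chunk_size = chunk_size

    def __call__(self, i):
        start = i * self.chunk_size
        # In the real op, this slice would be copied to the GPU here.
        return self.dataset[start:start + self.chunk_size]

get_chunk = DatasetToMinibatch(list(range(10)), chunk_size=4)
print(get_chunk(1))  # [4, 5, 6, 7]
```

The training loop would then just ask the op for chunk `i` at each step, and the same code would work whether the op copies to the GPU or not.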