pylearn: changeset 1116:18a092001752
An idea about Datasets and GPU.
author      Arnaud Bergeron <abergeron@gmail.com>
date        Tue, 14 Sep 2010 14:20:31 -0400
parents     967975f9c574
children    c1943feada10
files       doc/v2_planning/dataset.txt
diffstat    1 files changed, 8 insertions(+), 0 deletions(-)
--- a/doc/v2_planning/dataset.txt	Tue Sep 14 13:41:48 2010 -0400
+++ b/doc/v2_planning/dataset.txt	Tue Sep 14 14:20:31 2010 -0400
@@ -364,3 +364,11 @@
 theano fct will be to transfer the dataset to the gpu before computation.
 If the data change at each call, that will be as efficient as changing the
 data manually every time in the shared variable.
+
+AB: I have an idea about this which kind of fits in the "building a
+theano op" thing that we talked about at the last meeting.
+
+We could have a specialized theano op that takes a dataset and returns
+chunks of it with an index using the standard Dataset interface. The
+code to transfer to the GPU or whatever goes in that Op and we don't
+need to change the dataset interface.
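
A minimal sketch of what such an Op could look like, not code from the
repository: the class name DatasetToChunkOp, the chunk_size argument, and
the assumption that the dataset object supports numpy-style slicing are all
hypothetical. The point is only that the index-to-chunk mapping (and any
GPU transfer) lives inside the Op, so the Dataset interface stays as it is.

import numpy
import theano
import theano.tensor as T

class DatasetToChunkOp(theano.Op):
    """Hypothetical Op mapping a chunk index to a chunk of `dataset`."""

    def __init__(self, dataset, chunk_size):
        self.dataset = dataset        # assumed to support numpy-style slicing
        self.chunk_size = chunk_size

    def make_node(self, index):
        index = T.as_tensor_variable(index)
        # The output type depends on the dataset; a float64 matrix is
        # assumed here for illustration.
        return theano.Apply(self, [index], [T.dmatrix()])

    def perform(self, node, inputs, output_storage):
        i, = inputs
        start = int(i) * self.chunk_size
        stop = start + self.chunk_size
        # Any transfer to the GPU (or other specialization) would go here,
        # so the Dataset interface itself does not need to change.
        output_storage[0][0] = numpy.asarray(self.dataset[start:stop],
                                             dtype='float64')

# Hypothetical usage: compile a function that returns the i-th chunk.
my_dataset = numpy.random.randn(1000, 20)   # stand-in for a real Dataset
index = T.lscalar('index')
get_chunk = theano.function(
    [index], DatasetToChunkOp(my_dataset, chunk_size=100)(index))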