diff doc/v2_planning/dataset.txt @ 1110:4797a4cb73e1
added comment to dataset.
author   | Frederic Bastien <nouiz@nouiz.org>
date     | Tue, 14 Sep 2010 09:08:11 -0400
parents  | 29b48deb6a84
children | 18a092001752
--- a/doc/v2_planning/dataset.txt	Tue Sep 14 09:01:16 2010 -0400
+++ b/doc/v2_planning/dataset.txt	Tue Sep 14 09:08:11 2010 -0400
@@ -357,3 +357,10 @@
 dataset is just a different class. But I'm happy to have all this GPU stuff
 sent to the learner as well if everybody else believes that is better.
+FB comment: I don't understand why you would need to recompile the Theano function.
+There are two cases. In the first, the data is in a shared variable: you can change
+the data in the shared variable directly without recompiling the Theano function.
+In the second, the dataset is an ordinary Theano variable: the first step of the
+Theano function will be to transfer the dataset to the GPU before the computation.
+If the data change at each call, that is as efficient as changing the data manually
+in the shared variable every time.
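
Below is a minimal Python/Theano sketch, not part of the changeset, illustrating the two
cases described in the comment: data held in a shared variable (swapped with set_value,
no recompilation) versus data passed as an ordinary input variable (transferred on every
call). The variable names and the toy computation are illustrative only.

    import numpy
    import theano
    import theano.tensor as T

    data = numpy.random.randn(1000, 100).astype(theano.config.floatX)

    # Case 1: the dataset lives in a shared variable.  The function is
    # compiled once; new data is swapped in with set_value(), and the
    # Theano function does not need to be recompiled.
    x_shared = theano.shared(data, name='x_shared')
    f_shared = theano.function([], T.sum(x_shared ** 2))
    print(f_shared())
    new_data = numpy.random.randn(1000, 100).astype(theano.config.floatX)
    x_shared.set_value(new_data)   # no recompilation needed
    print(f_shared())

    # Case 2: the dataset is an ordinary input variable.  Each call
    # transfers the argument (to the GPU when one is used) before the
    # computation runs.
    x = T.matrix('x')
    f_input = theano.function([x], T.sum(x ** 2))
    print(f_input(data))
    print(f_input(new_data))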