# HG changeset patch
# User Frederic Bastien
# Date 1284469691 14400
# Node ID 4797a4cb73e17fcf3e9e51811cd01db3905e8d6c
# Parent  29b48deb6a84e90f898cb18b9ef38f0110493c8e
added comment to dataset.

diff -r 29b48deb6a84 -r 4797a4cb73e1 doc/v2_planning/dataset.txt
--- a/doc/v2_planning/dataset.txt	Tue Sep 14 09:01:16 2010 -0400
+++ b/doc/v2_planning/dataset.txt	Tue Sep 14 09:08:11 2010 -0400
@@ -357,3 +357,10 @@
 dataset is just a different class. But I'm happy to have all this GPU stuff
 send to the learner as well if everybody else believe that is better.
 
+FB comment: I don't understand why you would need to recompile the Theano function.
+There are two cases. First, the data live in a shared variable: you can change the
+data in the shared variable directly, without recompiling the Theano function.
+Second, the dataset is an ordinary Theano variable: in that case, the first step in
+the Theano function will be to transfer the dataset to the GPU before computation.
+If the data change at each call, that is as efficient as manually changing the data
+in the shared variable every time.