pylearn: diff gradient_learner.py @ 23:526e192b0699
Working on ApplyFunctionDataSet, added constraint that
DataSet iterators must have a next_index() method.
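The constraint named in the commit message can be illustrated with a minimal sketch: an iterator that always knows, and can report, the index of the next example it will yield. Everything here except the next_index() method name is hypothetical; the real pylearn DataSet API (written for Python 2 in 2008) differs.

```python
class ListDataSetIterator:
    """Hypothetical iterator honoring the next_index() constraint.

    Illustrative only; not pylearn's actual DataSet iterator class.
    """

    def __init__(self, examples):
        self._examples = examples
        self._pos = 0  # index of the next example to be yielded

    def __iter__(self):
        return self

    def next_index(self):
        # The constraint from the commit message: expose the index
        # of the next example this iterator will return.
        return self._pos

    def __next__(self):
        if self._pos >= len(self._examples):
            raise StopIteration
        example = self._examples[self._pos]
        self._pos += 1
        return example


it = ListDataSetIterator(["a", "b", "c"])
assert it.next_index() == 0
next(it)
assert it.next_index() == 1
```

Keeping the index as explicit state (rather than deriving it from consumed output) is what lets callers such as ApplyFunctionDataSet line up computed fields with their source rows.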
author:   bengioy@esprit.iro.umontreal.ca
date:     Wed, 09 Apr 2008 18:27:13 -0400
parents:  266c68cb6136
children: 672fe4b23032
--- a/gradient_learner.py	Mon Apr 07 20:44:37 2008 -0400
+++ b/gradient_learner.py	Wed Apr 09 18:27:13 2008 -0400
@@ -9,9 +9,9 @@
     Base class for gradient-based optimization of a training criterion
     that can consist in two parts, an additive part over examples,
     and an example-independent part (usually called the regularizer).
-    The user provides a Theano formula that maps the fields of a training example
-    and parameters to output fields (for the use function), one of which must be a cost
-    that is the training criterion to be minimized. Subclasses implement
+    The user provides a Theano formula that maps the fields of a minibatch (each being a tensor with the
+    same number of rows = minibatch size) and parameters to output fields (for the use function), one of which
+    must be a cost that is the training criterion to be minimized. Subclasses implement
     a training strategy that uses the Theano formula to compute gradients and
     to compute outputs in the update method.
     The inputs, parameters, and outputs are lists of Theano tensors,
@@ -55,6 +55,6 @@
         if not self.use_functions.has_key(use_function_key):
            self.use_function[use_function_key]=Function(input_fields,output_fields)
        use_function = self.use_functions[use_function_key]
-        # return a virtual dataset that computes the outputs on demand
-        return input_dataset.apply_function(use_function,input_fields,output_fields,copy_inputs,accept_minibatches=???)
+        # return a dataset that computes the outputs
+        return input_dataset.applyFunction(use_function,input_fields,output_fields,copy_inputs,compute_now=True)
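The second hunk replaces a lazily evaluated "virtual dataset" (apply_function, with its accept_minibatches argument still undecided, hence the ???) by an eagerly computed applyFunction(..., compute_now=True). A minimal sketch of that eager-versus-lazy distinction follows; apart from the compute_now parameter name taken from the diff, every name here is hypothetical and not pylearn's actual ApplyFunctionDataSet implementation.

```python
class ApplyFunctionDataSet:
    """Hypothetical dataset that maps a function over another dataset's rows.

    compute_now=True runs the function over all inputs up front (the
    behavior the diff switches to); compute_now=False defers work until
    the rows are first accessed (the old "virtual dataset" behavior).
    """

    def __init__(self, input_rows, function, compute_now=False):
        self._input_rows = input_rows
        self._function = function
        self._cache = None
        if compute_now:
            # Eager path: compute every output immediately.
            self._cache = [function(row) for row in input_rows]

    def rows(self):
        if self._cache is None:
            # Lazy path: compute on first access, then memoize.
            self._cache = [self._function(row) for row in self._input_rows]
        return self._cache


ds = ApplyFunctionDataSet([1, 2, 3], lambda x: x * x, compute_now=True)
# ds.rows() -> [1, 4, 9], already computed before this call
```

Eager computation trades memory and up-front time for predictable access cost, which matters when the mapped function is a compiled Theano Function whose per-call overhead favors batching all the work at once.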