Mercurial > pylearn
comparison gradient_learner.py @ 23:526e192b0699
Working on ApplyFunctionDataSet, added constraint that
DataSet iterators must have a next_index() method.
| author | bengioy@esprit.iro.umontreal.ca |
|---|---|
| date | Wed, 09 Apr 2008 18:27:13 -0400 |
| parents | 266c68cb6136 |
| children | 672fe4b23032 |
comparing 22:b6b36f65664f with 23:526e192b0699
```diff
@@ -7,13 +7,13 @@
 class GradientLearner(Learner):
     """
     Base class for gradient-based optimization of a training criterion
     that can consist in two parts, an additive part over examples, and
     an example-independent part (usually called the regularizer).
-    The user provides a Theano formula that maps the fields of a training example
-    and parameters to output fields (for the use function), one of which must be a cost
-    that is the training criterion to be minimized. Subclasses implement
+    The user provides a Theano formula that maps the fields of a minibatch (each being a tensor with the
+    same number of rows = minibatch size) and parameters to output fields (for the use function), one of which
+    must be a cost that is the training criterion to be minimized. Subclasses implement
     a training strategy that uses the Theano formula to compute gradients and
     to compute outputs in the update method.
     The inputs, parameters, and outputs are lists of Theano tensors,
     while the example_wise_cost and regularization_term are Theano tensors.
     The user can specify a regularization coefficient that multiplies the regularization term.
@@ -53,8 +53,8 @@
 
         use_function_key = input_fields+output_fields
         if not self.use_functions.has_key(use_function_key):
             self.use_functions[use_function_key]=Function(input_fields,output_fields)
         use_function = self.use_functions[use_function_key]
-        # return a virtual dataset that computes the outputs on demand
-        return input_dataset.apply_function(use_function,input_fields,output_fields,copy_inputs,accept_minibatches=???)
+        # return a dataset that computes the outputs
+        return input_dataset.applyFunction(use_function,input_fields,output_fields,copy_inputs,compute_now=True)
```
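The docstring in this diff describes a training criterion that splits into two parts: an additive cost over the examples of a minibatch, plus an example-independent regularization term scaled by a user-supplied coefficient. A minimal NumPy sketch of that decomposition, using squared-error linear regression as the example-wise cost (all names here are hypothetical illustrations, not pylearn or Theano API):

```python
import numpy as np

def example_wise_cost(X, y, w):
    # additive part: squared error per example (rows of X are examples,
    # matching the "tensor with rows = minibatch size" convention above)
    return (X @ w - y) ** 2

def regularization_term(w):
    # example-independent part: an L2 penalty on the parameters
    return np.sum(w ** 2)

def training_criterion(X, y, w, reg_coef=0.1):
    # total cost = sum over minibatch examples + coefficient * regularizer
    return np.sum(example_wise_cost(X, y, w)) + reg_coef * regularization_term(w)

def gradient(X, y, w, reg_coef=0.1):
    # gradient of the criterion w.r.t. the parameters; this is the quantity
    # a subclass would obtain symbolically from the Theano formula
    return 2 * X.T @ (X @ w - y) + 2 * reg_coef * w
```

A gradient-based subclass would repeatedly evaluate `gradient` on minibatches and update `w`; the analytic gradient can be checked against finite differences of `training_criterion`.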