view learner.py @ 115:01aa97a2212d

removed dependency to plearn

author    Frederic Bastien <bastienf@iro.umontreal.ca>
date      Wed, 07 May 2008 12:19:36 -0400
parents   88257dfedf8c
children  d0a1bd0378c6
import copy

from dataset import *
from compile import Function

class Learner(AttributesHolder):
    """
    Base class for learning algorithms. It provides a common interface so
    that different algorithms can be used interchangeably: a Learner is a
    learning algorithm, a function that when applied to training data
    returns a learned function, i.e. an object that can be applied to
    other data to return some output data.
    """

    def __init__(self):
        pass

    def forget(self):
        """
        Reset the state of the learner to a blank slate, before seeing
        training data. The operation may be non-deterministic if the
        learner has a random number generator that is set to use a
        different seed each time forget() is called.
        """
        raise NotImplementedError

    def update(self, training_set, train_stats_collector=None):
        """
        Continue training a learner, with the evidence provided by the
        given training set. Hence update can be called multiple times.
        This is particularly useful in the on-line setting or the
        sequential (Bayesian or not) settings. The result is a function
        that can be applied on data, with the same semantics as the
        Learner.use method.

        The user may optionally provide a training StatsCollector that
        is used to record some statistics of the outputs computed during
        training. It is updated during training.
        """
        return self.use # default behavior is 'non-adaptive', i.e. update does not do anything

    def __call__(self, training_set, train_stats_collector=None):
        """
        Train a learner from scratch using the provided training set,
        and return the learned function.
        """
        self.forget()
        return self.update(training_set, train_stats_collector)

    def use(self, input_dataset, output_fields=None, copy_inputs=True):
        """
        Once a Learner has been trained by one or more calls to 'update',
        it can be used with one or more calls to 'use'. The argument is
        a DataSet (possibly containing a single example) and the result
        is a DataSet of the same length. If output_fields is specified,
        it may be used to indicate which fields should be constructed in
        the output DataSet (for example ['output','classification_error']).
        Optionally, if copy_inputs, the input fields (of the input_dataset)
        are made visible in the output DataSet returned by this method.
        """
        raise NotImplementedError

    def attributeNames(self):
        """
        A Learner may have attributes that it wishes to export to other
        objects. To automate such export, sub-classes should define here
        the names (list of strings) of these attributes.

        @todo By default, attributeNames looks for all dictionary entries
        whose name does not start with _.
        """
        return []

    def updateInputAttributes(self):
        """
        A subset of self.attributeNames() which are the names of
        attributes needed by update() in order to do its work.
        """
        raise AbstractFunction()

    def useInputAttributes(self):
        """
        A subset of self.attributeNames() which are the names of
        attributes needed by use() in order to do its work.
        """
        raise AbstractFunction()

    def updateOutputAttributes(self):
        """
        A subset of self.attributeNames() which are the names of
        attributes modified/created by update() in order to do its work.
        By default these are inferred from the various update output
        attributes:
        """
        return ["parameters"] + self.updateMinibatchOutputAttributes() + self.updateEndOutputAttributes()

    def useOutputAttributes(self):
        """
        A subset of self.attributeNames() which are the names of
        attributes modified/created by use() in order to do its work.
        """
        raise AbstractFunction()
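# Example (illustrative only, not part of pylearn): a minimal concrete
# Learner showing the forget/update/use contract. 'MeanLearner', its
# 'input' field, and the assumption that iterating a DataSet yields
# mapping-like examples are all hypothetical; a real implementation
# would return a DataSet from use() rather than a plain list.
#
# class MeanLearner(Learner):
#     """Learn the mean of the 'input' field; output it for every example."""
#     def forget(self):
#         self.sum = 0.0
#         self.count = 0
#     def update(self, training_set, train_stats_collector=None):
#         for example in training_set:
#             self.sum += example['input']
#             self.count += 1
#         return self.use
#     def use(self, input_dataset, output_fields=None, copy_inputs=True):
#         mean = self.sum / max(self.count, 1)
#         return [mean for _ in input_dataset]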
""" raise AbstractFunction() class TLearner(Learner): """ TLearner is a virtual class of Learners that attempts to factor out of the definition of a learner the steps that are common to many implementations of learning algorithms, so as to leave only 'the equations' to define in particular sub-classes, using Theano. In the default implementations of use and update, it is assumed that the 'use' and 'update' methods visit examples in the input dataset sequentially. In the 'use' method only one pass through the dataset is done, whereas the sub-learner may wish to iterate over the examples multiple times. Subclasses where this basic model is not appropriate can simply redefine update or use. Sub-classes must provide the following functions and functionalities: - attributeNames(): defines all the names of attributes which can be used as fields or attributes in input/output datasets or in stats collectors. All these attributes are expected to be theano.Result objects (with a .data property and recognized by theano.Function for compilation). The sub-class constructor defines the relations between the Theano variables that may be used by 'use' and 'update' or by a stats collector. - defaultOutputFields(input_fields): return a list of default dataset output fields when None are provided by the caller of use. The following naming convention is assumed and important. Attributes whose names are listed in attributeNames() can be of any type, but those that can be referenced as input/output dataset fields or as output attributes in 'use' or as input attributes in the stats collector should be associated with a Theano Result variable. If the exported attribute name is <name>, the corresponding Result name (an internal attribute of the TLearner, created in the sub-class constructor) should be _<name>. Typically <name> will be numpy ndarray and _<name> will be the corresponding Theano Tensor (for symbolic manipulation). @todo pousser dans Learner toute la poutine qui peut l'etre sans etre dependant de Theano """ def __init__(self): Learner.__init__(self) def defaultOutputFields(self, input_fields): """ Return a default list of output field names (to put in the output dataset). This will be used when None are provided (as output_fields) by the caller of the 'use' method. This may involve looking at the input_fields (names) available in the input_dataset. """ raise AbstractFunction() def allocate(self, minibatch): """ This function is called at the beginning of each updateMinibatch and should be used to check that all required attributes have been allocated and initialized (usually this function calls forget() when it has to do an initialization). """ raise AbstractFunction() def minibatchwise_use_functions(self, input_fields, output_fields, stats_collector): """ Private helper function called by the generic TLearner.use. It returns a function that can map the given input fields to the given output fields (along with the attributes that the stats collector needs for its computation. The function called also automatically makes use of the self.useInputAttributes() and sets the self.useOutputAttributes(). 
""" if not output_fields: output_fields = self.defaultOutputFields(input_fields) if stats_collector: stats_collector_inputs = stats_collector.input2UpdateAttributes() for attribute in stats_collector_inputs: if attribute not in input_fields: output_fields.append(attribute) key = (input_fields,output_fields) if key not in self.use_functions_dictionary: use_input_attributes = self.useInputAttributes() use_output_attributes = self.useOutputAttributes() complete_f = Function(self.names2OpResults(input_fields+use_input_attributes), self.names2OpResults(output_fields+use_output_attributes)) def f(*input_field_values): input_attribute_values = self.names2attributes(use_input_attributes) results = complete_f(*(input_field_values + input_attribute_values)) output_field_values = results[0:len(output_fields)] output_attribute_values = results[len(output_fields):len(results)] if use_output_attributes: self.setAttributes(use_output_attributes,output_attribute_values) return output_field_values self.use_functions_dictionary[key]=f return self.use_functions_dictionary[key] def attributes(self,return_copy=False): """ Return a list with the values of the learner's attributes (or optionally, a deep copy). """ return self.names2attributes(self.attributeNames(),return_copy) def names2attributes(self,names,return_copy=False): """ Private helper function that maps a list of attribute names to a list of (optionally copies) values of attributes. """ if return_copy: return [copy.deepcopy(self.__getattr__(name).data) for name in names] else: return [self.__getattr__(name).data for name in names] def names2OpResults(self,names): """ Private helper function that maps a list of attribute names to a list of corresponding Op Results (with the same name but with a '_' prefix). """ return [self.__getattr__('_'+name).data for name in names] def use(self,input_dataset,output_fieldnames=None,output_attributes=[], test_stats_collector=None,copy_inputs=True, put_stats_in_output_dataset=True): """ The learner tries to compute in the output dataset the output fields specified @todo check if some of the learner attributes are actually SPECIFIED as attributes of the input_dataset, and if so use their values instead of the ones in the learner. The learner tries to compute in the output dataset the output fields specified. If None is specified then self.defaultOutputFields(input_dataset.fieldNames()) is called to determine the output fields. Attributes of the learner can also optionally be copied into the output dataset. If output_attributes is None then all of the attributes in self.AttributeNames() are copied in the output dataset, but if it is [] (the default), then none are copied. If a test_stats_collector is provided, then its attributes (test_stats_collector.AttributeNames()) are also copied into the output dataset attributes. 
""" minibatchwise_use_function = minibatchwise_use_functions(input_dataset.fieldNames(), output_fieldnames, test_stats_collector) virtual_output_dataset = ApplyFunctionDataSet(input_dataset, minibatchwise_use_function, True,DataSet.numpy_vstack, DataSet.numpy_hstack) # actually force the computation output_dataset = CachedDataSet(virtual_output_dataset,True) if copy_inputs: output_dataset = input_dataset | output_dataset # copy the wanted attributes in the dataset if output_attributes is None: output_attributes = self.attributeNames() if output_attributes: assert set(attribute_names) <= set(self.attributeNames()) output_dataset.setAttributes(output_attributes, self.names2attributes(output_attributes,return_copy=True)) if test_stats_collector: test_stats_collector.update(output_dataset) if put_stats_in_output_dataset: output_dataset.setAttributes(test_stats_collector.attributeNames(), test_stats_collector.attributes()) return output_dataset class MinibatchUpdatesTLearner(TLearner): """ This adds to TLearner a - updateStart(), updateEnd(), updateMinibatch(minibatch), isLastEpoch(): functions executed at the beginning, the end, in the middle (for each minibatch) of the update method, and at the end of each epoch. This model only works for 'online' or one-shot learning that requires going only once through the training data. For more complicated models, more specialized subclasses of TLearner should be used or a learning-algorithm specific update method should be defined. - a 'parameters' attribute which is a list of parameters (whose names are specified by the user's subclass with the parameterAttributes() method) """ def __init__(self): TLearner.__init__(self) self.update_minibatch_function = Function(self.names2OpResults(self.updateMinibatchOutputAttributes()+ self.updateMinibatchInputFields()), self.names2OpResults(self.updateMinibatchOutputAttributes())) self.update_end_function = Function(self.names2OpResults(self.updateEndInputAttributes()), self.names2OpResults(self.updateEndOutputAttributes())) def updateMinibatchInputFields(self): raise AbstractFunction() def updateMinibatchInputAttributes(self): raise AbstractFunction() def updateMinibatchOutputAttributes(self): raise AbstractFunction() def updateEndInputAttributes(self): raise AbstractFunction() def updateEndOutputAttributes(self): raise AbstractFunction() def parameterAttributes(self): raise AbstractFunction() def updateStart(self): pass def updateEnd(self): self.setAttributes(self.updateEndOutputAttributes(), self.update_end_function (self.names2attributes(self.updateEndInputAttributes()))) self.parameters = self.names2attributes(self.parameterAttributes()) def updateMinibatch(self,minibatch): # make sure all required fields are allocated and initialized self.allocate(minibatch) self.setAttributes(self.updateMinibatchOutputAttributes(), self.update_minibatch_function(*(self.names2attributes(self.updateMinibatchInputAttributes())) + minibatch(self.updateMinibatchInputFields()))) def isLastEpoch(self): """ This method is called at the end of each epoch (cycling over the training set). It returns a boolean to indicate if this is the last epoch. By default just do one epoch. """ return True def update(self,training_set,train_stats_collector=None): """ @todo check if some of the learner attributes are actually SPECIFIED in as attributes of the training_set. 
""" self.updateStart(training_set) stop=False while not stop: if train_stats_collector: train_stats_collector.forget() # restart stats collectin at the beginning of each epoch for minibatch in training_set.minibatches(self.training_set_input_fields, minibatch_size=self.minibatch_size): self.update_minibatch(minibatch) if train_stats_collector: minibatch_set = minibatch.examples() minibatch_set.setAttributes(self.attributeNames(),self.attributes()) train_stats_collector.update(minibatch_set) stop = self.isLastEpoch() self.updateEnd() return self.use class OnlineGradientBasedTLearner(MinibatchUpdatesTLearner): """ Specialization of MinibatchUpdatesTLearner in which the minibatch updates are obtained by performing an online (minibatch-based) gradient step. Sub-classes must define the following methods: """ def __init__(self,truly_online=False): """ If truly_online then only one pass is made through the training set passed to update(). """ self.truly_online=truly_online def isLastEpoch(self): return self.truly_online