.. _v2planning_learner:

Learner API
===========

# A list of "task types"

'''
List of task types:

Attributes
    sequential
    spatial
    structured
    semi-supervised
    missing-values

Supervised (x, y)
    classification
    regression
    probabilistic classification
    ranking
    conditional density estimation
    collaborative filtering
    ordinal regression ?= ranking

Unsupervised (x)
    de-noising
    feature learning (transformation): PCA, DAA
    density estimation
    inference

Other
    generation (sampling)
    structure learning ???

Notes on metrics & statistics:
    - some are applied to an example, others to a batch
    - most statistics are on the dataset
'''

class Learner(object):
    '''
    Takes data as input and learns a prediction function (or several).

    A learner is parametrized by hyper-parameters, which can be set from
    the outside (by a "client" of the Learner, which can be a HyperLearner,
    a Tester, ...).

    The data can be given all at once as a dataset, or incrementally.
    Some learners need to be fully trained in one step, whereas others can
    be trained incrementally.

    The question of statistics collection during training remains open.
    '''

    # Return a dictionary of hyper-parameter names (keys) and values.
    def get_hyper_parameters(self):
        pass

    def set_hyper_parameters(self, dictionary):
        pass

    # Ver B
    def eval(self, dataset):
        pass

    def predict(self, dataset):
        pass

    # Trainable
    def train(self, dataset):
        # Train until completion.
        pass

    # Incremental (see the MeanLearner sketch at the end of this document)
    def use_dataset(self, dataset):
        pass

    def adapt(self, n_steps=1):
        pass

    def has_converged(self):
        pass


# Some example cases

class HyperLearner(Learner):

    ### def get_hyper_parameter_distribution(name)
    def set_hyper_parameters_distribution(self, dictionary):
        pass


def bagging(learner_factory, datasets):
    # Train one learner per dataset. Passing the datasets in explicitly
    # resolves the original "todo: get dataset_i ??"; see below for one
    # way to build them.
    learners = []
    for dataset_i in datasets:
        learner_i = learner_factory.new()
        learner_i.train(dataset_i)
        learners.append(learner_i)
    return learners
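
The bagging sketch above leaves open where the per-learner datasets come
from. Here is a minimal sketch of one way to build them, assuming a dataset
can be represented as a plain Python list of examples; the resampling scheme
(drawing with replacement up to the original size) is the standard bagging
choice, and bootstrap_datasets is an illustrative name, not part of the
proposed API.

import random

def bootstrap_datasets(dataset, n_datasets, seed=0):
    # Each bootstrap dataset draws len(dataset) examples with
    # replacement from the original dataset.
    rng = random.Random(seed)
    return [[rng.choice(dataset) for _ in dataset]
            for _ in range(n_datasets)]

# Usage with the bagging function above:
# learners = bagging(learner_factory, bootstrap_datasets(dataset, 10))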
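
To make the incremental protocol (use_dataset / adapt / has_converged)
concrete, here is a minimal sketch of a Learner whose "prediction" is the
running mean of a dataset of numbers. The API does not fix the dataset
representation; this example assumes a dataset is a plain list, and
MeanLearner, tolerance, and all internals are illustrative, not part of the
proposed API.

class MeanLearner(Learner):

    def __init__(self, tolerance=1e-6):
        self.tolerance = tolerance        # hyper-parameter
        self.mean = 0.0
        self.n_seen = 0
        self.last_change = float('inf')
        self.dataset = None
        self.cursor = 0

    def get_hyper_parameters(self):
        return {'tolerance': self.tolerance}

    def set_hyper_parameters(self, dictionary):
        self.tolerance = dictionary.get('tolerance', self.tolerance)

    def use_dataset(self, dataset):
        self.dataset = dataset
        self.cursor = 0

    def adapt(self, n_steps=1):
        # Consume up to n_steps examples, updating the running mean.
        for _ in range(n_steps):
            if self.cursor >= len(self.dataset):
                self.last_change = 0.0    # dataset exhausted
                return
            x = self.dataset[self.cursor]
            self.cursor += 1
            self.n_seen += 1
            old_mean = self.mean
            self.mean += (x - self.mean) / self.n_seen
            self.last_change = abs(self.mean - old_mean)

    def has_converged(self):
        return self.last_change < self.tolerance

    def predict(self, dataset):
        return [self.mean for _ in dataset]

# The client-side training loop this API implies:
# learner = MeanLearner()
# learner.set_hyper_parameters({'tolerance': 1e-8})
# learner.use_dataset([1.0, 2.0, 3.0, 4.0])
# while not learner.has_converged():
#     learner.adapt(n_steps=2)
# learner.predict([0.0, 0.0])    # -> [2.5, 2.5]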
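
One way a HyperLearner could use the distribution interface is random
search over hyper-parameter values; the API above does not commit to this.
This sketch assumes that eval(dataset) returns a scalar cost (lower is
better) and that each entry of the distribution dictionary is a callable
drawing one value from an rng; RandomSearchHyperLearner, n_trials, and seed
are illustrative names, not part of the proposed API.

import random

class RandomSearchHyperLearner(HyperLearner):

    def __init__(self, learner_factory, n_trials=10, seed=0):
        self.learner_factory = learner_factory
        self.n_trials = n_trials
        self.rng = random.Random(seed)
        self.distributions = {}       # name -> callable(rng) -> value
        self.best_learner = None
        self.best_score = float('inf')

    def set_hyper_parameters_distribution(self, dictionary):
        self.distributions = dictionary

    def train(self, dataset):
        # Sample hyper-parameters, train an inner learner per sample,
        # and keep the one with the lowest cost (assumed semantics of
        # eval; the API leaves this open).
        for _ in range(self.n_trials):
            sample = dict((name, draw(self.rng))
                          for name, draw in self.distributions.items())
            learner = self.learner_factory.new()
            learner.set_hyper_parameters(sample)
            learner.train(dataset)
            score = learner.eval(dataset)
            if score < self.best_score:
                self.best_score = score
                self.best_learner = learner

# hyper = RandomSearchHyperLearner(learner_factory, n_trials=20)
# hyper.set_hyper_parameters_distribution(
#     {'tolerance': lambda rng: 10 ** rng.uniform(-8, -2)})
# hyper.train(dataset)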