Mercurial > pylearn
changeset 1240:317049b21b77
RST in API_learner
author    Pascal Lamblin <lamblinp@iro.umontreal.ca>
date      Thu, 23 Sep 2010 12:02:16 -0400
parents   470beb000694
children  6801451a86bb
files     doc/v2_planning/API_learner.txt
diffstat  1 files changed, 82 insertions(+), 66 deletions(-)
--- a/doc/v2_planning/API_learner.txt	Thu Sep 23 11:49:42 2010 -0400
+++ b/doc/v2_planning/API_learner.txt	Thu Sep 23 12:02:16 2010 -0400
@@ -3,99 +3,115 @@
 Learner API
 ===========
 
-# A list of "task types"
+A list of "task types"
+----------------------
 
-'''
- List of tasks types:
- Attributes
+Attributes
+~~~~~~~~~~
 
- sequential
- spatial
- structured
- semi-supervised
- missing-values
+- sequential
+- spatial
+- structured
+- semi-supervised
+- missing-values
 
- Supervised (x,y)
+Supervised (x,y)
+~~~~~~~~~~~~~~~~
 
- classification
- regression
- probabilistic classification
- ranking
- conditional density estimation
- collaborative filtering
- ordinal regression ?= ranking
+- classification
+- regression
+- probabilistic classification
+- ranking
+- conditional density estimation
+- collaborative filtering
+- ordinal regression ?= ranking
 
- Unsupervised (x)
+Unsupervised (x)
+~~~~~~~~~~~~~~~~
 
- de-noising
- feature learning ( transformation ) PCA, DAA
- density estimation
- inference
+- de-noising
+- feature learning ( transformation ) PCA, DAA
+- density estimation
+- inference
 
- Other
+Other
+~~~~~
 
- generation (sampling)
- structure learning ???
+- generation (sampling)
+- structure learning ???
 
-Notes on metrics & statistics:
+Notes on metrics & statistics
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 - some are applied to an example, others on a batch
 - most statistics are on the dataset
-'''
+
+The Learner class
+-----------------
+
+.. code-block:: python
 
-class Learner(Object):
-    '''
-    Takes data as inputs, and learns a prediction function (or several).
+    class Learner(Object):
+        '''
+        Takes data as inputs, and learns a prediction function (or several).
 
-    A learner is parametrized by hyper-parameters, which can be set from the
-    outside (a "client" from Learner, that can be a HyperLearner, a
-    Tester,...).
+        A learner is parametrized by hyper-parameters, which can be set from the
+        outside (a "client" from Learner, that can be a HyperLearner, a
+        Tester,...).
 
-    The data can be given all at a time as a data set, or incrementally.
-    Some learner need to be fully trained in one step, whereas other can be
-    trained incrementally.
+        The data can be given all at a time as a data set, or incrementally.
+        Some learner need to be fully trained in one step, whereas other can be
+        trained incrementally.
 
-    The question of statistics collection during training remains open.
-    '''
-    #def use_dataset(dataset)
+        The question of statistics collection during training remains open.
+        '''
+        #def use_dataset(dataset)
 
-    # return a dictionary of hyperparameters names(keys)
-    # and value(values)
-    def get_hyper_parameters()
-    def set_hyper_parameters(dictionary)
+        # return a dictionary of hyperparameters names(keys)
+        # and value(values)
+        def get_hyper_parameters():
+            ...
+        def set_hyper_parameters(dictionary):
+            ...
 
-
-
-    # Ver B
-    def eval(dataset)
-    def predict(dataset)
+        # Ver B
+        def eval(dataset):
+            ...
+        def predict(dataset):
+            ...
 
-    # Trainable
-    def train(dataset) # train until complition
+        # Trainable
+        def train(dataset):  # train until completion
+            ...
 
-    # Incremental
-    def use_dataset(dataset)
-    def adapt(n_steps =1)
-    def has_converged()
-
-    #
+        # Incremental
+        def use_dataset(dataset):
+            ...
+        def adapt(n_steps=1):
+            ...
+        def has_converged():
+            ...
+        #
 
-# Some example cases
+Some example cases
+------------------
+
+.. code-block:: python
 
-class HyperLearner(Learner):
+    class HyperLearner(Learner):
 
-    ### def get_hyper_parameter_distribution(name)
-    def set_hyper_parameters_distribution(dictionary)
+        ### def get_hyper_parameter_distribution(name)
+        def set_hyper_parameters_distribution(dictionary):
+            ...
 
-def bagging(learner_factory):
-    for i in range(N):
-        learner_i = learner_factory.new()
-        # todo: get dataset_i ??
-        learner_i.use_dataset(dataset_i)
-        learner_i.train()
+    def bagging(learner_factory):
+        for i in range(N):
+            learner_i = learner_factory.new()
+            # todo: get dataset_i ??
+            learner_i.use_dataset(dataset_i)
+            learner_i.train()
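The incremental protocol proposed in the diff (``use_dataset`` / ``adapt`` / ``has_converged``, plus the hyper-parameter dictionary accessors) can be made concrete with a toy learner. This is only a minimal sketch of how a client might drive the proposed interface: the ``MeanLearner`` class and its ``tolerance`` hyper-parameter are hypothetical illustrations, not part of the planned API.

```python
class MeanLearner:
    """Toy learner: predicts the running mean of the values seen so far."""

    def __init__(self, tolerance=1e-6):
        self.tolerance = tolerance  # hypothetical hyper-parameter
        self.mean = 0.0
        self.count = 0
        self.dataset = None
        self._last_shift = float('inf')

    # Hyper-parameter access, as proposed in the API.
    def get_hyper_parameters(self):
        return {'tolerance': self.tolerance}

    def set_hyper_parameters(self, dictionary):
        self.tolerance = dictionary.get('tolerance', self.tolerance)

    # Incremental protocol.
    def use_dataset(self, dataset):
        self.dataset = list(dataset)

    def adapt(self, n_steps=1):
        # One "step" = one pass over the stored dataset, updating the
        # running mean incrementally.
        for _ in range(n_steps):
            for x in self.dataset:
                old = self.mean
                self.count += 1
                self.mean += (x - self.mean) / self.count
            self._last_shift = abs(self.mean - old)

    def has_converged(self):
        return self._last_shift < self.tolerance

    def predict(self, dataset):
        return [self.mean for _ in dataset]


# Client-side training loop, in the style the API sketches.
learner = MeanLearner()
learner.set_hyper_parameters({'tolerance': 1e-3})
learner.use_dataset([1.0, 2.0, 3.0])
while not learner.has_converged():
    learner.adapt(n_steps=1)
print(learner.predict([0.0]))  # predictions approach the dataset mean (2.0)
```

The same ``use_dataset``/``train`` pair is what the ``bagging`` example in the diff relies on, so any learner implementing this protocol could be dropped into that loop unchanged.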