Mercurial > pylearn
changeset 1168:77b6ed85d3f7
Update doc of learner's API
author      Pascal Lamblin <lamblinp@iro.umontreal.ca>
date        Fri, 17 Sep 2010 14:29:40 -0400
parents     7a8dcf87d780
children    3a1225034751
files       doc/v2_planning/API_learner.txt
diffstat    1 files changed, 27 insertions(+), 8 deletions(-)
--- a/doc/v2_planning/API_learner.txt	Fri Sep 17 13:57:46 2010 -0400
+++ b/doc/v2_planning/API_learner.txt	Fri Sep 17 14:29:40 2010 -0400
@@ -1,11 +1,5 @@
-
+# A list of "task types"
-def bagging(learner_factory):
-    for i in range(N):
-        learner_i = learner_factory.new()
-        # todo: get dataset_i ??
-        learner_i.use_dataset(dataset_i)
-        learner_i.train()
 '''
 List of tasks types:
 
 Attributes
@@ -44,8 +38,22 @@
 - some are applied to an example, others on a batch
 - most statistics are on the dataset
 '''
+
+
 class Learner(Object):
-
+    '''
+    Takes data as inputs, and learns a prediction function (or several).
+
+    A learner is parametrized by hyper-parameters, which can be set from the
+    outside (a "client" from Learner, that can be a HyperLearner, a
+    Tester, ...).
+
+    The data can be given all at a time as a data set, or incrementally.
+    Some learner need to be fully trained in one step, whereas other can be
+    trained incrementally.
+
+    The question of statistics collection during training remains open.
+    '''
     #def use_dataset(dataset)
 
     # return a dictionary of hyperparameters names(keys)
@@ -70,7 +78,18 @@
 #
+
+# Some example cases
+
 class HyperLearner(Learner):
 
     ### def get_hyper_parameter_distribution(name)
 
     def set_hyper_parameters_distribution(dictionary)
+
+
+def bagging(learner_factory):
+    for i in range(N):
+        learner_i = learner_factory.new()
+        # todo: get dataset_i ??
+        learner_i.use_dataset(dataset_i)
+        learner_i.train()
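The changeset above only names the pieces of the proposed API (a Learner with use_dataset/train, a factory with new(), and the bagging example). A minimal runnable sketch of how they could fit together follows; the method bodies, the LearnerFactory class, and the explicit list of per-learner datasets are all assumptions filled in for illustration, not part of the planned API:

```python
# Sketch of the Learner API from the diff. Everything beyond the method
# names (use_dataset, train, new, bagging) is an illustrative assumption.

class Learner(object):
    """Takes data as input and learns a prediction function."""

    def __init__(self, **hyper_parameters):
        # Hyper-parameters are set from the outside (e.g. by a HyperLearner).
        self.hyper_parameters = dict(hyper_parameters)
        self.dataset = None

    def use_dataset(self, dataset):
        # Here the full training set is given at once; an incremental
        # learner would instead accept successive batches.
        self.dataset = dataset

    def train(self):
        # Placeholder "training": remember the mean of the targets.
        targets = [y for _, y in self.dataset]
        self.prediction = sum(targets) / float(len(targets))

    def predict(self, x):
        return self.prediction


class LearnerFactory(object):
    """Produces fresh, untrained Learner instances via new()."""

    def __init__(self, learner_class, **hyper_parameters):
        self.learner_class = learner_class
        self.hyper_parameters = hyper_parameters

    def new(self):
        return self.learner_class(**self.hyper_parameters)


def bagging(learner_factory, datasets):
    # One learner per resampled dataset, as in the diff's example; the
    # open "todo: get dataset_i ??" is resolved here by passing the
    # bootstrap samples in explicitly.
    learners = []
    for dataset_i in datasets:
        learner_i = learner_factory.new()
        learner_i.use_dataset(dataset_i)
        learner_i.train()
        learners.append(learner_i)
    return learners


factory = LearnerFactory(Learner)
ensemble = bagging(factory, [[(0, 1.0), (1, 3.0)], [(0, 2.0), (1, 4.0)]])
preds = [l.predict(0) for l in ensemble]
print(preds)  # -> [2.0, 3.0]
```

Keeping dataset acquisition outside bagging() matches the docstring's point that data can be supplied from the outside, all at once or incrementally.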