diff doc/v2_planning/main_plan.txt @ 1051:bc246542d6ff
added file for the formulas committee.
author:   Frederic Bastien <nouiz@nouiz.org>
date:     Wed, 08 Sep 2010 15:39:51 -0400
parents:  2e515be92a0e
children: 1ed0719cfbce
--- a/doc/v2_planning/main_plan.txt	Wed Sep 08 14:26:35 2010 -0400
+++ b/doc/v2_planning/main_plan.txt	Wed Sep 08 15:39:51 2010 -0400
@@ -234,40 +234,6 @@
 For each thing with a functional spec (e.g. datasets library, optimization
 library) make a separate file.
 
-
-pylearn.formulas
-----------------
-
-Directory with functions for building layers, calculating classification
-errors, cross-entropies with various distributions, free energies, etc. This
-module would include for the most part global functions, Theano Ops and Theano
-optimizations.
-
-Yoshua: I would break it down in module files, e.g.:
-
-pylearn.formulas.costs: generic / common cost functions, e.g. various cross-entropies, squared error,
-abs. error, various sparsity penalties (L1, Student)
-
-pylearn.formulas.linear: formulas for linear classifier, linear regression, factor analysis, PCA
-
-pylearn.formulas.nnet: formulas for building layers of various kinds, various activation functions,
-layers which could be plugged with various costs & penalties, and stacked
-
-pylearn.formulas.ae: formulas for auto-encoders and denoising auto-encoder variants
-
-pylearn.formulas.noise: formulas for corruption processes
-
-pylearn.formulas.rbm: energies, free energies, conditional distributions, Gibbs sampling
-
-pylearn.formulas.trees: formulas for decision trees
-
-pylearn.formulas.boosting: formulas for boosting variants
-
-etc.
-
-Fred: It seems that the DeepANN git repository by Xavier G. has part of this implemented as functions.
-
 Indexing Convention
 ~~~~~~~~~~~~~~~~~~~
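To make the proposed module breakdown concrete, here is a minimal sketch of the kind of formulas pylearn.formulas.costs and pylearn.formulas.rbm would collect. This is an illustration only: the function names are hypothetical, and plain NumPy stands in for the Theano expressions the plan actually calls for.

```python
# Hypothetical sketch of formulas that pylearn.formulas.costs and
# pylearn.formulas.rbm might collect. NumPy is used for illustration;
# the real module would build Theano symbolic expressions instead.
import numpy as np

def binary_cross_entropy(output, target):
    """Mean binary cross-entropy between predictions in (0, 1)
    and 0/1 targets."""
    return float(np.mean(-target * np.log(output)
                         - (1.0 - target) * np.log(1.0 - output)))

def squared_error(output, target):
    """Mean squared error between predictions and targets."""
    return float(np.mean((output - target) ** 2))

def l1_penalty(weights):
    """L1 sparsity penalty: sum of absolute weight values."""
    return float(np.sum(np.abs(weights)))

def rbm_free_energy(v, W, vbias, hbias):
    """Free energy of a binary-binary RBM for a visible vector v:
    F(v) = -v . vbias - sum_j log(1 + exp(hbias_j + (v @ W)_j))."""
    wx_b = v @ W + hbias
    return float(-(v @ vbias) - np.sum(np.log1p(np.exp(wx_b))))
```

Gathering such small, stateless functions in one place is the point of the proposal: models would be composed from shared, well-tested formulas rather than each algorithm re-deriving its own cost expressions.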