# HG changeset patch
# User Frederic Bastien
# Date 1281620714 14400
# Node ID 216f4ce969b2f345ee448fb53fe855269f6e9280
# Parent  7c4504a4ce1aadf98eef4fd36e747abc3722c2f9
small addition

diff -r 7c4504a4ce1a -r 216f4ce969b2 doc/v2_planning.txt
--- a/doc/v2_planning.txt	Wed Aug 11 21:32:31 2010 -0400
+++ b/doc/v2_planning.txt	Thu Aug 12 09:45:14 2010 -0400
@@ -74,7 +74,7 @@
 neural nets, though.
 
 There are a number of ideas floating around for how to handle classes /
-modules (LeDeepNet, pylearn.shared.layers, pynnet) so lets implement as much
+modules (LeDeepNet, pylearn.shared.layers, pynnet, DeepAnn) so let's implement as much
 math as possible in global functions with no classes.  There are no models in
 the wish list that require than a few vectors and matrices to parametrize.
 Global functions are more reusable than classes.
@@ -213,7 +213,9 @@
 pylearn.formulas.nnet:  formulas for building layers of various kinds, various
 activation functions, layers which could be plugged with various costs &
 penalties, and stacked
-pylearn.formulas.ae:  formulas for auto-encoders, denoising auto-encoder variants, and corruption processes
+pylearn.formulas.ae:  formulas for auto-encoders and denoising auto-encoder variants
+
+pylearn.formulas.noise:  formulas for corruption processes
 
 pylearn.formulas.rbm:  energies, free energies, conditional distributions,
 Gibbs sampling
@@ -223,6 +225,8 @@
 
 etc.
 
+Fred: It seems that the DeepANN git repository by Xavier G. has part of this implemented as functions.
+
 
 Indexing Convention
 ~~~~~~~~~~~~~~~~~~~
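The patch splits corruption processes out of `pylearn.formulas.ae` into a separate `pylearn.formulas.noise` module of plain functions. A minimal sketch of what one such global function might look like, using NumPy; the function name and signature are assumptions for illustration, not taken from the repository:

```python
import numpy as np

def binomial_corruption(x, corruption_level, rng=None):
    """Masking noise for denoising auto-encoders: zero out each entry of
    `x` independently with probability `corruption_level`.

    Hypothetical sketch of a pylearn.formulas.noise-style global function;
    written as a standalone function (no classes) per the plan above.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Keep each entry with probability (1 - corruption_level).
    mask = rng.random(x.shape) >= corruption_level
    return x * mask
```

Because it is a free function rather than a method, the same corruption process can be reused by any model (auto-encoder, RBM pre-training, etc.), which is the reusability argument the document makes for global functions over classes.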