changeset 947:216f4ce969b2

small addition
author Frederic Bastien <nouiz@nouiz.org>
date Thu, 12 Aug 2010 09:45:14 -0400
parents 7c4504a4ce1a
children d944e1c26a57
files doc/v2_planning.txt
diffstat 1 files changed, 6 insertions(+), 2 deletions(-)
line diff
--- a/doc/v2_planning.txt	Wed Aug 11 21:32:31 2010 -0400
+++ b/doc/v2_planning.txt	Thu Aug 12 09:45:14 2010 -0400
@@ -74,7 +74,7 @@
 neural nets, though.
 
 There are a number of ideas floating around for how to handle classes /
-modules (LeDeepNet, pylearn.shared.layers, pynnet) so lets implement as much
+modules (LeDeepNet, pylearn.shared.layers, pynnet, DeepAnn), so let's implement as much
 math as possible in global functions with no classes.  There are no models in
 the wish list that require more than a few vectors and matrices to parametrize.
 Global functions are more reusable than classes.
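
To make the "global functions, no classes" proposal concrete, here is a minimal
sketch of what such formulas could look like, assuming Theano symbolic variables
as inputs; the names sigmoid_layer and binary_crossentropy are hypothetical, not
an existing pylearn API:

    import theano.tensor as T

    def sigmoid_layer(x, w, b):
        """Hypothetical formula: affine transform followed by a logistic
        sigmoid.  x, w and b are symbolic Theano variables, so the same
        function can be reused by any model without wrapping it in a class."""
        return T.nnet.sigmoid(T.dot(x, w) + b)

    def binary_crossentropy(output, target):
        """Hypothetical formula: mean cross-entropy between two
        (minibatch, units) tensors with values in (0, 1)."""
        return -T.mean(target * T.log(output)
                       + (1 - target) * T.log(1 - output))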
@@ -213,7 +213,9 @@
 pylearn.formulas.nnet: formulas for building layers of various kinds, various activation functions,
 layers which could be plugged with various costs & penalties, and stacked
 
-pylearn.formulas.ae: formulas for auto-encoders, denoising auto-encoder variants, and corruption processes
+pylearn.formulas.ae: formulas for auto-encoders and denoising auto-encoder variants
+
+pylearn.formulas.noise: formulas for corruption processes
 
 pylearn.formulas.rbm: energies, free energies, conditional distributions, Gibbs sampling
 
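As a rough illustration of the kind of formulas pylearn.formulas.noise and
pylearn.formulas.rbm could collect, here is a short sketch; mask_corruption and
rbm_free_energy are hypothetical names, the corruption mask is assumed to be
sampled elsewhere (e.g. from a Theano RandomStreams), and the free energy
assumes a binary-binary RBM:

    import theano.tensor as T

    def mask_corruption(x, mask):
        """Hypothetical corruption formula: zero out the entries of x
        selected by a pre-sampled binary mask of the same shape."""
        return mask * x

    def rbm_free_energy(v, w, vbias, hbias):
        """Hypothetical RBM formula: free energy of a binary-binary RBM,
        F(v) = -v.vbias - sum_j softplus(hbias_j + (v.W)_j),
        for a (minibatch, visible) matrix v."""
        return -T.dot(v, vbias) - T.sum(T.nnet.softplus(T.dot(v, w) + hbias),
                                        axis=1)
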
@@ -223,6 +225,8 @@
 
 etc.
 
+Fred: It seems that the DeepANN git repository by Xavier G. has part of this implemented as functions.
+
 Indexing Convention
 ~~~~~~~~~~~~~~~~~~~