annotate baseline/mlp/mlp_nist.py @ 613:5e481b224117

Fix the reading of the PNIST dataset following Dumi's compression of the data.
author Frederic Bastien <nouiz@nouiz.org>
date Thu, 06 Jan 2011 13:57:05 -0500
parents 868f82777839
children
"""
This tutorial introduces the multilayer perceptron using Theano.

A multilayer perceptron is a logistic regressor where,
instead of feeding the input to the logistic regression, you insert an
intermediate layer, called the hidden layer, that has a nonlinear
activation function (usually tanh or sigmoid). One can use many such
hidden layers, making the architecture deep. The tutorial will also tackle
the problem of MNIST digit classification.

.. math::

    f(x) = G( b^{(2)} + W^{(2)}( s( b^{(1)} + W^{(1)} x))),

References:

    - textbooks: "Pattern Recognition and Machine Learning" -
      Christopher M. Bishop, section 5

TODO: recommended preprocessing, lr ranges, regularization ranges (explain
to do lr first, then add regularization)

"""
__docformat__ = 'restructuredtext en'

import sys
import pdb
import time

import numpy
import pylab
import theano
import theano.tensor as T
import theano.tensor.nnet
import pylearn
import pylearn.version
import ift6266
from pylearn.io import filetensor as ft
from ift6266 import datasets

data_path = '/data/lisa/data/nist/by_class/'


class MLP(object):
    """Multi-Layer Perceptron class.

    A multilayer perceptron is a feedforward artificial neural network model
    that has one or more layers of hidden units with nonlinear activations.
    Intermediate layers usually use the tanh or sigmoid function as their
    activation, while the top layer is a softmax layer.
    """
    def __init__(self, input, n_in, n_hidden, n_out, learning_rate,
                 detection_mode):
        """Initialize the parameters of the multilayer perceptron.

        :param input: symbolic variable that describes the input of the
                      architecture (one minibatch)

        :param n_in: number of input units, the dimension of the space in
                     which the datapoints lie

        :param n_hidden: number of hidden units

        :param n_out: number of output units, the dimension of the space in
                      which the labels lie

        :param learning_rate: initial learning rate, stored in a shared
                              variable so it can be adapted during training

        :param detection_mode: if 0, use a softmax output layer; otherwise
                               use a sigmoid output layer
        """

        # Initialize the parameters theta = (W1, b1, W2, b2). Note that this
        # example contains only one hidden layer, but one can add as many
        # layers as desired, making the network deeper. The only problem
        # with making the network deep this way is that learning becomes
        # difficult: backpropagation struggles to move the network away from
        # its starting point. This is where pre-training helps, by giving a
        # good starting point for backpropagation, but more about this in
        # the other tutorials.

        # `W1` is initialized with `W1_values`, which is uniformly sampled
        # between -6./sqrt(n_in+n_hidden) and 6./sqrt(n_in+n_hidden).
        # The output of uniform is converted using asarray to dtype
        # theano.config.floatX so that the code is runnable on GPU.
        W1_values = numpy.asarray(numpy.random.uniform(
                low=-numpy.sqrt(6. / (n_in + n_hidden)),
                high=numpy.sqrt(6. / (n_in + n_hidden)),
                size=(n_in, n_hidden)), dtype=theano.config.floatX)
        # `W2` is initialized with `W2_values`, which is uniformly sampled
        # between -6./sqrt(n_hidden+n_out) and 6./sqrt(n_hidden+n_out).
        # The output of uniform is converted using asarray to dtype
        # theano.config.floatX so that the code is runnable on GPU.
        W2_values = numpy.asarray(numpy.random.uniform(
                low=-numpy.sqrt(6. / (n_hidden + n_out)),
                high=numpy.sqrt(6. / (n_hidden + n_out)),
                size=(n_hidden, n_out)), dtype=theano.config.floatX)
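As a quick sanity check of the comments above, every sampled weight stays within the stated bound ±sqrt(6/(fan_in+fan_out)). A standalone sketch with made-up layer sizes (float32 stands in for theano.config.floatX, which usually resolves to float32 on GPU):

```python
import numpy

n_in, n_hidden = 1024, 500                  # hypothetical layer sizes
bound = numpy.sqrt(6. / (n_in + n_hidden))  # Glorot-style uniform range
W1_values = numpy.asarray(numpy.random.uniform(
    low=-bound, high=bound, size=(n_in, n_hidden)),
    dtype='float32')
```

`numpy.random.uniform` samples on the half-open interval, so no weight can fall outside [-bound, bound).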

        self.W1 = theano.shared(value=W1_values)
        self.b1 = theano.shared(value=numpy.zeros((n_hidden,),
                                                  dtype=theano.config.floatX))
        self.W2 = theano.shared(value=W2_values)
        self.b2 = theano.shared(value=numpy.zeros((n_out,),
                                                  dtype=theano.config.floatX))
        # Include the learning rate in the classifier so
        # we can modify it on the fly when we want.
        lr_value = learning_rate
        self.lr = theano.shared(value=lr_value)

        # symbolic expression computing the values of the hidden layer
        self.hidden = T.tanh(T.dot(input, self.W1) + self.b1)

        # symbolic expression computing the values of the top layer
        if detection_mode == 0:
            self.p_y_given_x = T.nnet.softmax(T.dot(self.hidden, self.W2) + self.b2)
        else:
            self.p_y_given_x = T.nnet.sigmoid(T.dot(self.hidden, self.W2) + self.b2)

        # compute the prediction as the class whose probability is maximal,
        # in symbolic form
        self.y_pred = T.argmax(self.p_y_given_x, axis=1)

        # L1 norm; one regularization option is to enforce the L1 norm to
        # be small
        self.L1 = abs(self.W1).sum() + abs(self.W2).sum()

        # square of the L2 norm; one regularization option is to enforce the
        # square of the L2 norm to be small
        self.L2_sqr = (self.W1 ** 2).sum() + (self.W2 ** 2).sum()

    def negative_log_likelihood(self, y):
        r"""Return the mean of the negative log-likelihood of the prediction
        of this model under a given target distribution.

        .. math::

            \frac{1}{|\mathcal{D}|} \mathcal{L} (\theta=\{W,b\}, \mathcal{D}) =
            \frac{1}{|\mathcal{D}|} \sum_{i=0}^{|\mathcal{D}|}
                \log(P(Y=y^{(i)}|x^{(i)}, W,b)) \\
            \ell (\theta=\{W,b\}, \mathcal{D})

        :param y: a vector that gives, for each example, the correct label
        """
        return -T.mean(T.log(self.p_y_given_x)[T.arange(y.shape[0]), y])

    def cross_entropy(self, y):
        # Cross-entropy for the sigmoid output: the target is treated as a
        # one-hot vector, so the correct unit contributes log(p) and every
        # other unit contributes log(1 - p).
        return -T.mean(T.log(self.p_y_given_x)[T.arange(y.shape[0]), y]
                       + T.sum(T.log(1 - self.p_y_given_x), axis=1)
                       - T.log(1 - self.p_y_given_x)[T.arange(y.shape[0]), y])
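A NumPy rendering of the same expression may be easier to read (a sketch, not part of the original file; `p` plays the role of `p_y_given_x` for one minibatch of per-class sigmoid outputs):

```python
import numpy

def cross_entropy(p, y):
    """-mean over examples of: log p[correct] + sum(log(1-p)) - log(1-p)[correct]."""
    idx = numpy.arange(p.shape[0])
    return -numpy.mean(numpy.log(p)[idx, y]
                       + numpy.sum(numpy.log(1 - p), axis=1)
                       - numpy.log(1 - p)[idx, y])

# two examples, two classes: sigmoid outputs need not sum to 1
p = numpy.array([[0.9, 0.1], [0.2, 0.7]])
y = numpy.array([0, 1])
ce = cross_entropy(p, y)
```

For each example, the `sum` term minus the third term leaves only log(1-p) over the wrong classes, so the whole thing is the usual one-hot sigmoid cross-entropy.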

    def errors(self, y):
        """Return a float representing the number of errors in the minibatch
        over the total number of examples of the minibatch.
        """
        # check if y has the same dimension as y_pred
        if y.ndim != self.y_pred.ndim:
            raise TypeError('y should have the same shape as self.y_pred',
                            ('y', y.type, 'y_pred', self.y_pred.type))
        # check if y is of the correct datatype
        if y.dtype.startswith('int'):
            # the T.neq operator returns a vector of 0s and 1s, where 1
            # represents a mistake in prediction
            return T.mean(T.neq(self.y_pred, y))
        else:
            raise NotImplementedError()
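The same error measure computed eagerly in NumPy rather than symbolically (a minimal sketch; `error_rate` is a hypothetical helper, not part of the file):

```python
import numpy

def error_rate(y_pred, y):
    """Fraction of minibatch examples whose predicted class differs from the label."""
    y_pred = numpy.asarray(y_pred)
    y = numpy.asarray(y)
    if y.ndim != y_pred.ndim:
        raise TypeError('y should have the same shape as y_pred')
    # (y_pred != y) is a boolean vector; its mean is the error rate
    return numpy.mean(y_pred != y)
```

This is exactly what `T.mean(T.neq(self.y_pred, y))` evaluates to once compiled.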


def mlp_get_nist_error(model_name='/u/mullerx/ift6266h10_sandbox_db/xvm_final_lr1_p073/8/best_model.npy.npz',
                       data_set=0):

    # load the data set and create an MLP based on the dimensions of the model
    model = numpy.load(model_name)
    W1 = model['W1']
    W2 = model['W2']
    b1 = model['b1']
    b2 = model['b2']

    # test-set counters
    total_error_count = 0.0
    total_exemple_count = 0.0

    nb_error_count = 0.0
    nb_exemple_count = 0.0

    char_error_count = 0.0
    char_exemple_count = 0.0

    min_error_count = 0.0
    min_exemple_count = 0.0

    maj_error_count = 0.0
    maj_exemple_count = 0.0

    # validation-set counters
    vtotal_error_count = 0.0
    vtotal_exemple_count = 0.0

    vnb_error_count = 0.0
    vnb_exemple_count = 0.0

    vchar_error_count = 0.0
    vchar_exemple_count = 0.0

    vmin_error_count = 0.0
    vmin_exemple_count = 0.0

    vmaj_error_count = 0.0
    vmaj_exemple_count = 0.0

    nbc_error_count = 0.0
    vnbc_error_count = 0.0

    # select the evaluation dataset
    if data_set == 0:
        print 'using nist'
        dataset = datasets.nist_all()
    elif data_set == 1:
        print 'using p07'
        dataset = datasets.nist_P07()
    elif data_set == 2:
        print 'using pnist'
        dataset = datasets.PNIST07()

    # get the test error
    # use a batch size of 1 so we can get the sub-class error
    # without messing with matrices (will be upgraded later)
    test_score = 0
    temp = 0
404
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
252 for xt,yt in dataset.test(1):
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
253
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
254 total_exemple_count = total_exemple_count +1
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
255 #get activation for layer 1
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
256 a0=numpy.dot(numpy.transpose(W1),numpy.transpose(xt[0])) + b1
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
257 #add non linear function to layer 1 activation
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
258 a0_out=numpy.tanh(a0)
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
259
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
260 #get activation for output layer
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
261 a1= numpy.dot(numpy.transpose(W2),a0_out) + b2
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
262 #add non linear function for output activation (softmax)
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
263 a1_exp = numpy.exp(a1)
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
264 sum_a1=numpy.sum(a1_exp)
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
265 a1_out=a1_exp/sum_a1
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
266
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
267 predicted_class=numpy.argmax(a1_out)
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
268 wanted_class=yt[0]
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
269 if(predicted_class!=wanted_class):
1509b9bba4cc added digit/char error
xaviermuller
parents: 378
diff changeset
270 total_error_count = total_error_count +1

        #36-class error: merge the two cases of each letter (labels 26 apart)
        #before deciding whether the prediction is wrong
        if not (predicted_class == wanted_class or
                ((predicted_class + 26 == wanted_class or
                  predicted_class - 26 == wanted_class) and wanted_class > 9)):
            nbc_error_count = nbc_error_count + 1

        #treat digit error: restrict the prediction to the 10 digit classes
        if wanted_class < 10:
            nb_exemple_count = nb_exemple_count + 1
            predicted_class = numpy.argmax(a1_out[0:10])
            if predicted_class != wanted_class:
                nb_error_count = nb_error_count + 1

        #character error: restrict the prediction to the letter classes,
        #ignoring case
        if wanted_class > 9:
            char_exemple_count = char_exemple_count + 1
            predicted_class = numpy.argmax(a1_out[10:62]) + 10
            if (predicted_class != wanted_class and
                    predicted_class + 26 != wanted_class and
                    predicted_class - 26 != wanted_class):
                char_error_count = char_error_count + 1

        #uppercase letters (classes 10-35)
        if wanted_class > 9 and wanted_class < 36:
            maj_exemple_count = maj_exemple_count + 1
            predicted_class = numpy.argmax(a1_out[10:36]) + 10
            if predicted_class != wanted_class:
                maj_error_count = maj_error_count + 1
        #lowercase letters (classes 36-61)
        if wanted_class > 35:
            min_exemple_count = min_exemple_count + 1
            predicted_class = numpy.argmax(a1_out[36:62]) + 36
            if predicted_class != wanted_class:
                min_error_count = min_error_count + 1

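The per-example forward pass used in the loop above can be isolated into a small standalone function. The helper name `forward` and the toy shapes are illustrative stand-ins, not part of this file; the arithmetic (tanh hidden layer, softmax output) mirrors the loop body exactly.

```python
import numpy

def forward(x, W1, b1, W2, b2):
    # hidden layer: tanh(W1^T x + b1), as in the loop above
    a0_out = numpy.tanh(numpy.dot(numpy.transpose(W1), x) + b1)
    # output layer: softmax(W2^T a0_out + b2)
    a1 = numpy.dot(numpy.transpose(W2), a0_out) + b2
    a1_exp = numpy.exp(a1)
    return a1_exp / numpy.sum(a1_exp)

# toy shapes: 4 inputs, 3 hidden units, 2 classes
rng = numpy.random.RandomState(0)
x = rng.rand(4)
W1 = rng.rand(4, 3); b1 = numpy.zeros(3)
W2 = rng.rand(3, 2); b2 = numpy.zeros(2)
probs = forward(x, W1, b1, W2, b2)
# probs is a valid probability distribution over the classes
```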

    # same bookkeeping on the validation set
    vtest_score=0
    vtemp=0
    for xt,yt in dataset.valid(1):

        vtotal_exemple_count = vtotal_exemple_count + 1
        #get the activation of hidden layer 1
        a0 = numpy.dot(numpy.transpose(W1), numpy.transpose(xt[0])) + b1
        #apply the non-linearity (tanh) to the layer 1 activation
        a0_out = numpy.tanh(a0)

        #get the activation of the output layer
        a1 = numpy.dot(numpy.transpose(W2), a0_out) + b2
        #apply the output non-linearity (softmax)
        a1_exp = numpy.exp(a1)
        sum_a1 = numpy.sum(a1_exp)
        a1_out = a1_exp / sum_a1

        predicted_class = numpy.argmax(a1_out)
        wanted_class = yt[0]
        if predicted_class != wanted_class:
            vtotal_error_count = vtotal_error_count + 1

        #36-class error on the validation set (case-insensitive for letters)
        if not (predicted_class == wanted_class or
                ((predicted_class + 26 == wanted_class or
                  predicted_class - 26 == wanted_class) and wanted_class > 9)):
            vnbc_error_count = vnbc_error_count + 1

        #treat digit error: restrict the prediction to the 10 digit classes
        if wanted_class < 10:
            vnb_exemple_count = vnb_exemple_count + 1
            predicted_class = numpy.argmax(a1_out[0:10])
            if predicted_class != wanted_class:
                vnb_error_count = vnb_error_count + 1

        #character error: restrict the prediction to the letter classes,
        #ignoring case
        if wanted_class > 9:
            vchar_exemple_count = vchar_exemple_count + 1
            predicted_class = numpy.argmax(a1_out[10:62]) + 10
            if (predicted_class != wanted_class and
                    predicted_class + 26 != wanted_class and
                    predicted_class - 26 != wanted_class):
                vchar_error_count = vchar_error_count + 1

        #uppercase letters (classes 10-35)
        if wanted_class > 9 and wanted_class < 36:
            vmaj_exemple_count = vmaj_exemple_count + 1
            predicted_class = numpy.argmax(a1_out[10:36]) + 10
            if predicted_class != wanted_class:
                vmaj_error_count = vmaj_error_count + 1
        #lowercase letters (classes 36-61)
        if wanted_class > 35:
            vmin_exemple_count = vmin_exemple_count + 1
            predicted_class = numpy.argmax(a1_out[36:62]) + 36
            if predicted_class != wanted_class:
                vmin_error_count = vmin_error_count + 1

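The case-merging rule that both loops apply (the two cases of one letter sit 26 labels apart in the 10-61 letter range) can be sketched as a tiny predicate; the helper name `same_ignoring_case` is hypothetical, but the condition is copied from the checks above.

```python
def same_ignoring_case(predicted, wanted):
    # mirrors the 36-class check above: classes 0-9 are digits and the two
    # cases of one letter are 26 labels apart in the 10-61 letter range
    if predicted == wanted:
        return True
    return wanted > 9 and (predicted + 26 == wanted or predicted - 26 == wanted)

assert same_ignoring_case(7, 7)        # exact match
assert same_ignoring_case(36, 10)      # same letter, other case
assert same_ignoring_case(10, 36)      # the rule is symmetric for letters
assert not same_ignoring_case(11, 38)  # different letters stay an error
assert not same_ignoring_case(3, 5)    # digits never merge
```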

    print (('total error = %f') % ((total_error_count/total_exemple_count)*100.0))
    print (('number error = %f') % ((nb_error_count/nb_exemple_count)*100.0))
    print (('char error = %f') % ((char_error_count/char_exemple_count)*100.0))
    print (('min error = %f') % ((min_error_count/min_exemple_count)*100.0))
    print (('maj error = %f') % ((maj_error_count/maj_exemple_count)*100.0))
    print (('36 error = %f') % ((nbc_error_count/total_exemple_count)*100.0))

    print (('valid total error = %f') % ((vtotal_error_count/vtotal_exemple_count)*100.0))
    print (('valid number error = %f') % ((vnb_error_count/vnb_exemple_count)*100.0))
    print (('valid char error = %f') % ((vchar_error_count/vchar_exemple_count)*100.0))
    print (('valid min error = %f') % ((vmin_error_count/vmin_exemple_count)*100.0))
    print (('valid maj error = %f') % ((vmaj_error_count/vmaj_exemple_count)*100.0))
    print (('valid 36 error = %f') % ((vnbc_error_count/vtotal_exemple_count)*100.0))

    print (('num total = %d,%d') % (total_exemple_count,total_error_count))
    print (('num nb = %d,%d') % (nb_exemple_count,nb_error_count))
    print (('num min = %d,%d') % (min_exemple_count,min_error_count))
    print (('num maj = %d,%d') % (maj_exemple_count,maj_error_count))
    print (('num char = %d,%d') % (char_exemple_count,char_error_count))


    #convert the raw counts to error rates
    total_error_count /= total_exemple_count
    nb_error_count /= nb_exemple_count
    char_error_count /= char_exemple_count
    min_error_count /= min_exemple_count
    maj_error_count /= maj_exemple_count
    nbc_error_count /= total_exemple_count

    vtotal_error_count /= vtotal_exemple_count
    vnb_error_count /= vnb_exemple_count
    vchar_error_count /= vchar_exemple_count
    vmin_error_count /= vmin_exemple_count
    vmaj_error_count /= vmaj_exemple_count
    vnbc_error_count /= vtotal_exemple_count

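Note that the in-place divisions above only yield fractional error rates if the counters were initialised as floats (their initialisation is outside this excerpt, so that is an assumption); with integer counters, old-style Python 2 division would silently truncate every rate to 0. A minimal illustration:

```python
# counters assumed to be floats, e.g. 42 errors out of 1000 examples
total_error_count = 42.0
total_exemple_count = 1000.0
rate = total_error_count / total_exemple_count  # fractional error rate

# what floor (old-style integer) division would give with int counters
truncated = 42 // 1000
```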
    return (total_error_count, nb_error_count, char_error_count,
            min_error_count, maj_error_count, nbc_error_count,
            vtotal_error_count, vnb_error_count, vchar_error_count,
            vmin_error_count, vmaj_error_count, vnbc_error_count)

def jobman_get_error(state,channel):
    (all_t_error, nb_t_error, char_t_error, min_t_error, maj_t_error, nbc_t_error,
     all_v_error, nb_v_error, char_v_error, min_v_error, maj_v_error, nbc_v_error) = \
             mlp_get_nist_error(data_set=state.data_set,
                                model_name=state.model_name)

    #store the test error rates (in percent) in the jobman state
    state.all_t_error = all_t_error*100.0
    state.nb_t_error = nb_t_error*100.0
    state.char_t_error = char_t_error*100.0
    state.min_t_error = min_t_error*100.0
    state.maj_t_error = maj_t_error*100.0
    state.nbc_t_error = nbc_t_error*100.0

    #store the validation error rates (in percent)
    state.all_v_error = all_v_error*100.0
    state.nb_v_error = nb_v_error*100.0
    state.char_v_error = char_v_error*100.0
    state.min_v_error = min_v_error*100.0
    state.maj_v_error = maj_v_error*100.0
    state.nbc_v_error = nbc_v_error*100.0

    pylearn.version.record_versions(state,[theano,ift6266,pylearn])
    return channel.COMPLETE


def mlp_full_nist( verbose = 1,
                   adaptive_lr = 0,
                   data_set = 0,
                   learning_rate = 0.01,
                   L1_reg = 0.00,
                   L2_reg = 0.0001,
                   nb_max_exemples = 1000000,
                   batch_size = 20,
                   nb_hidden = 30,
                   nb_targets = 62,
                   tau = 1e6,
                   lr_t2_factor = 0.5,
                   init_model = 0,
                   channel = 0,
                   detection_mode = 0):

    if channel != 0:
        channel.save()
    configuration = [learning_rate, nb_max_exemples, nb_hidden, adaptive_lr]

    #save the initial learning rate in case the classical adaptive lr is used
    initial_lr = learning_rate
    max_div_count = 1000
    optimal_test_error = 0

    total_validation_error_list = []
    total_train_error_list = []
    learning_rate_list = []
    best_training_error = float('inf')
    divergence_flag_list = []

    #select the training data: NIST, P07 or PNIST
    if data_set == 0:
        print 'using nist'
        dataset = datasets.nist_all()
    elif data_set == 1:
        print 'using p07'
        dataset = datasets.nist_P07()
    elif data_set == 2:
        print 'using pnist'
        dataset = datasets.PNIST07()

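The `tau` parameter together with the saved `initial_lr` feeds a classical learning-rate decay applied later in this file, outside this excerpt; the exact formula is not shown here, so the schedule below, `lr_t = initial_lr * tau / (tau + t)`, is only one common form of time decay, sketched under that assumption with a hypothetical helper name.

```python
def decayed_lr(initial_lr, tau, t):
    # assumed time-decay schedule: equal to initial_lr at t = 0,
    # halved once t updates have elapsed at t == tau
    return initial_lr * tau / (tau + float(t))

lr0 = 0.01
lr_start = decayed_lr(lr0, 1e6, 0)     # no decay yet
lr_at_tau = decayed_lr(lr0, 1e6, 1e6)  # halved after tau updates
```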
    ishape = (32,32) # this is the size of NIST images

    # allocate symbolic variables for the data
    x = T.fmatrix() # the data is presented as rasterized images
    y = T.lvector() # the labels are presented as a 1D vector of
                    # [long int] labels

    # construct the MLP classifier
    classifier = MLP( input=x,
                      n_in=32*32,
                      n_hidden=nb_hidden,
                      n_out=nb_targets,
                      learning_rate=learning_rate,
                      detection_mode=detection_mode)
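Given the shapes implied above (32*32 = 1024 input pixels, `nb_hidden` hidden units, `nb_targets` = 62 output classes), the size of this one-hidden-layer MLP can be counted directly; the helper name is illustrative.

```python
def mlp_param_count(n_in, n_hidden, n_out):
    # W1 (n_in x n_hidden) + b1 (n_hidden) + W2 (n_hidden x n_out) + b2 (n_out)
    return n_in * n_hidden + n_hidden + n_hidden * n_out + n_out

# the default configuration above: 32*32 inputs, 30 hidden units, 62 classes
n_params = mlp_param_count(32 * 32, 30, 62)
```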

    # check if we want to initialise the weights with a previously trained model;
    # dimensions must be consistent between the old model and the current
    # configuration (nb_hidden and nb_targets)
    if init_model != 0:
        print 'using old model'
        print init_model
        old_model = numpy.load(init_model)
        classifier.W1.value = old_model['W1']
        classifier.W2.value = old_model['W2']
        classifier.b1.value = old_model['b1']
        classifier.b2.value = old_model['b2']

    # the cost we minimize during training is the negative log likelihood of
    # the model plus the regularization terms (L1 and L2); the cost is
    # expressed here symbolically
    if detection_mode == 0:
        cost = classifier.negative_log_likelihood(y) \
               + L1_reg * classifier.L1 \
               + L2_reg * classifier.L2_sqr
xaviermuller
parents: 414
diff changeset
506 else:
868f82777839 added jobman all test + val error and sigmoid output
xaviermuller
parents: 414
diff changeset
507 cost = classifier.cross_entropy(y) \
868f82777839 added jobman all test + val error and sigmoid output
xaviermuller
parents: 414
diff changeset
508 + L1_reg * classifier.L1 \
868f82777839 added jobman all test + val error and sigmoid output
xaviermuller
parents: 414
diff changeset
509 + L2_reg * classifier.L2_sqr
868f82777839 added jobman all test + val error and sigmoid output
xaviermuller
parents: 414
diff changeset
510
110
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
511
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
512 # compiling a theano function that computes the mistakes that are made by
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
513 # the model on a minibatch
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
514 test_model = theano.function([x,y], classifier.errors(y))
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
515
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
516 # compute the gradient of cost with respect to theta = (W1, b1, W2, b2)
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
517 g_W1 = T.grad(cost, classifier.W1)
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
518 g_b1 = T.grad(cost, classifier.b1)
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
519 g_W2 = T.grad(cost, classifier.W2)
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
520 g_b2 = T.grad(cost, classifier.b2)
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
521
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
522 # specify how to update the parameters of the model as a dictionary
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
523 updates = \
143
f341a4efb44a added adaptive lr, weight file save, traine error and error curves
XavierMuller
parents: 110
diff changeset
524 { classifier.W1: classifier.W1 - classifier.lr*g_W1 \
f341a4efb44a added adaptive lr, weight file save, traine error and error curves
XavierMuller
parents: 110
diff changeset
525 , classifier.b1: classifier.b1 - classifier.lr*g_b1 \
f341a4efb44a added adaptive lr, weight file save, traine error and error curves
XavierMuller
parents: 110
diff changeset
526 , classifier.W2: classifier.W2 - classifier.lr*g_W2 \
f341a4efb44a added adaptive lr, weight file save, traine error and error curves
XavierMuller
parents: 110
diff changeset
527 , classifier.b2: classifier.b2 - classifier.lr*g_b2 }
110
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
528
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
529 # compiling a theano function `train_model` that returns the cost, but in
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
530 # the same time updates the parameter of the model based on the rules
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
531 # defined in `updates`
93b4b84d86cf added simple mlp file
XavierMuller
parents:
diff changeset
532 train_model = theano.function([x, y], cost, updates = updates )
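The `updates` dictionary above is plain stochastic gradient descent: every parameter moves one step against its gradient, scaled by the learning rate. A minimal numpy sketch of the same rule (the `sgd_update` helper and the toy values are invented for illustration, not part of this file):

```python
import numpy as np

def sgd_update(params, grads, lr):
    # theta <- theta - lr * grad for every parameter, the same rule as the
    # `updates` dictionary compiled into `train_model` above
    return dict((name, params[name] - lr * grads[name]) for name in params)

# toy parameter and gradient values, invented for illustration
params = {'W1': np.array([1.0, 2.0]), 'b1': np.array([0.5])}
grads = {'W1': np.array([0.1, 0.2]), 'b1': np.array([0.5])}
new_params = sgd_update(params, grads, lr=0.1)
```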
    # conditions for stopping the adaptation:
    # 1) we have reached nb_max_exemples (this is rounded up to be a multiple
    #    of the train size so we always do at least 1 epoch)
    # 2) the validation error has gone up several times in a row (probable
    #    overfitting)

    # This means we no longer stop on slow convergence, as low learning rates
    # stopped too fast; instead we wait for the validation error to go up
    # max_div_count times in a row. We save the curve of the validation error
    # so we can always go back to check on it, and we save the absolute best
    # model anyway, so we might as well explore a bit when diverging.

    # approximate number of samples in the NIST training set;
    # this is just to have a validation frequency roughly proportional to the
    # original NIST training set
    n_minibatches = 650000 / batch_size

    patience = 2 * nb_max_exemples / batch_size  # in units of minibatches
    validation_frequency = n_minibatches / 4

    best_validation_loss = float('inf')
    best_iter = 0
    test_score = 0.
    start_time = time.clock()
    time_n = 0  # in units of exemples
    minibatch_index = 0
    epoch = 0
    temp = 0
    divergence_flag = 0

    print 'starting training'
    sys.stdout.flush()
    while minibatch_index * batch_size < nb_max_exemples:

        for x, y in dataset.train(batch_size):

            # if we are using the classic learning rate decay, adjust it
            # before training on the current minibatch
            if adaptive_lr == 2:
                classifier.lr.value = tau * initial_lr / (tau + time_n)
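The `adaptive_lr == 2` branch above is the classic 1/t decay schedule: the learning rate equals `initial_lr` at the start and is halved once `time_n` (examples seen) reaches `tau`. A standalone sketch (the function name is made up):

```python
def classic_lr_decay(initial_lr, tau, t):
    # lr(t) = tau * lr0 / (tau + t): equal to lr0 at t=0 and lr0/2 at t=tau,
    # decaying like 1/t afterwards
    return tau * initial_lr / (tau + float(t))

lr_start = classic_lr_decay(0.1, 10000.0, 0)        # still the initial rate
lr_later = classic_lr_decay(0.1, 10000.0, 10000.0)  # halved after tau examples
```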
            # train the model on the current minibatch
            cost_ij = train_model(x, y)

            if minibatch_index % validation_frequency == 0:
                # save the current learning rate
                learning_rate_list.append(classifier.lr.value)
                divergence_flag_list.append(divergence_flag)

                # compute the validation error
                this_validation_loss = 0.
                temp = 0
                for xv, yv in dataset.valid(1):
                    # sum up the errors for each minibatch
                    this_validation_loss += test_model(xv, yv)
                    temp = temp + 1
                # get the average by dividing by the number of minibatches
                this_validation_loss /= temp
                # save the validation loss
                total_validation_error_list.append(this_validation_loss)
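The validation loop above accumulates per-minibatch errors and divides by the count to get a mean; the same accumulate-then-divide pattern, factored into a hypothetical helper:

```python
def mean_error(error_fn, batches):
    # sum the per-minibatch error, then divide by the number of minibatches,
    # mirroring the this_validation_loss / temp computation above
    total = 0.0
    count = 0
    for x, y in batches:
        total += error_fn(x, y)
        count += 1
    return total / count

# toy error function and batches, invented for illustration
avg = mean_error(lambda x, y: abs(x - y), [(1, 0), (3, 1)])
```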
                print(('epoch %i, minibatch %i, learning rate %f, '
                       'current validation error %f %%') %
                      (epoch, minibatch_index+1, classifier.lr.value,
                       this_validation_loss*100.))
                sys.stdout.flush()

                # save temporary results so we can check on them during training
                numpy.savez('temp_results.npy', config=configuration,
                            total_validation_error_list=total_validation_error_list,
                            learning_rate_list=learning_rate_list,
                            divergence_flag_list=divergence_flag_list)

                # if we got the best validation score until now
                if this_validation_loss < best_validation_loss:
                    # save the best validation score and iteration number
                    best_validation_loss = this_validation_loss
                    best_iter = minibatch_index
                    # reset the divergence flag
                    divergence_flag = 0

                    # save the best model; overwrite the currently saved best
                    # model so we only keep the best one
                    numpy.savez('best_model.npy', config=configuration,
                                W1=classifier.W1.value, W2=classifier.W2.value,
                                b1=classifier.b1.value, b2=classifier.b2.value,
                                minibatch_index=minibatch_index)

                    # test it on the test set
                    test_score = 0.
                    temp = 0
                    for xt, yt in dataset.test(batch_size):
                        test_score += test_model(xt, yt)
                        temp = temp + 1
                    test_score /= temp

                    print(('epoch %i, minibatch %i, test error of best '
                           'model %f %%') %
                          (epoch, minibatch_index+1,
                           test_score*100.))
                    sys.stdout.flush()
                    optimal_test_error = test_score

                # if the validation error is going up, we are overfitting (or
                # oscillating); check if we are allowed to continue and whether
                # we will adjust the learning rate
                elif this_validation_loss >= best_validation_loss:

                    # with non-classic learning rate decay, we lower the
                    # learning rate only when the validation error goes up
                    if adaptive_lr == 1:
                        classifier.lr.value = classifier.lr.value * lr_t2_factor

                    # count consecutive increases; if the validation error goes
                    # up max_div_count times in a row, we stop immediately
                    divergence_flag = divergence_flag + 1

                    # calculate the test error at this point
                    test_score = 0.
                    temp = 0
                    for xt, yt in dataset.test(batch_size):
                        test_score += test_model(xt, yt)
                        temp = temp + 1
                    test_score /= temp

                    print ' validation error is going up, possibly stopping soon'
                    print(('    epoch %i, minibatch %i, test error of best '
                           'model %f %%') %
                          (epoch, minibatch_index+1,
                           test_score*100.))
                    sys.stdout.flush()

                # check the early stopping condition
                if divergence_flag == max_div_count:
                    minibatch_index = nb_max_exemples
                    print 'we have diverged, early stopping kicks in'
                    break
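The `divergence_flag` bookkeeping above counts consecutive validation checks that fail to improve on the best score, and triggers early stopping once the streak reaches `max_div_count`. A self-contained sketch of the same rule (the `diverged` helper is made up for illustration):

```python
def diverged(validation_errors, max_div_count):
    # track the best error seen so far; any result that beats it resets the
    # streak, any result that does not extends it; report True as soon as
    # the streak reaches max_div_count
    best = float('inf')
    streak = 0
    for err in validation_errors:
        if err < best:
            best = err
            streak = 0
        else:
            streak += 1
            if streak == max_div_count:
                return True
    return False

stop_now = diverged([0.5, 0.4, 0.45, 0.46, 0.47], 3)   # three increases in a row
keep_going = diverged([0.5, 0.4, 0.3, 0.35], 3)        # still improving overall
```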
            # check if we have seen enough exemples;
            # force at least one full epoch
            if epoch > 0 and minibatch_index * batch_size > nb_max_exemples:
                break

            time_n = time_n + batch_size
            minibatch_index = minibatch_index + 1

        # we have finished looping through the training set
        epoch = epoch + 1

    end_time = time.clock()

    print(('Optimization complete. Best validation score of %f %% '
           'obtained at iteration %i, with test performance %f %%') %
          (best_validation_loss * 100., best_iter, test_score*100.))
    print ('The code ran for %f minutes' % ((end_time-start_time)/60.))
    print minibatch_index
    sys.stdout.flush()

    # save the final model, the weights and the error curves
    numpy.savez('model.npy', config=configuration, W1=classifier.W1.value,
                W2=classifier.W2.value, b1=classifier.b1.value,
                b2=classifier.b2.value)
    numpy.savez('results.npy', config=configuration,
                total_train_error_list=total_train_error_list,
                total_validation_error_list=total_validation_error_list,
                learning_rate_list=learning_rate_list,
                divergence_flag_list=divergence_flag_list)

    return (best_training_error*100.0, best_validation_loss*100.,
            optimal_test_error*100., best_iter*batch_size,
            (end_time-start_time)/60)
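The `numpy.savez` calls above bundle the weights and the error curves into archives; note that numpy appends `.npz` when the given name does not already end in it, so 'model.npy' actually lands on disk as 'model.npy.npz'. A round-trip sketch using an in-memory buffer instead of a file (the arrays are stand-ins for the real weight matrices):

```python
import io
import numpy as np

# hypothetical stand-ins for the weight matrices saved above
W1 = np.arange(6, dtype=float).reshape(2, 3)
b1 = np.zeros(3)

buf = io.BytesIO()            # in-memory file, so nothing touches disk
np.savez(buf, W1=W1, b1=b1)   # same call shape as numpy.savez('model.npy', ...)
buf.seek(0)

restored = np.load(buf)       # NpzFile: behaves like a dict of arrays
```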
if __name__ == '__main__':
    mlp_full_nist()


def jobman_mlp_full_nist(state, channel):
    (train_error, validation_error, test_error, nb_exemples, time) = \
        mlp_full_nist(learning_rate=state.learning_rate,
                      nb_max_exemples=state.nb_max_exemples,
                      nb_hidden=state.nb_hidden,
                      adaptive_lr=state.adaptive_lr,
                      tau=state.tau,
                      verbose=state.verbose,
                      lr_t2_factor=state.lr_t2_factor,
                      data_set=state.data_set,
                      init_model=state.init_model,
                      detection_mode=state.detection_mode,
                      channel=channel)
    state.train_error = train_error
    state.validation_error = validation_error
    state.test_error = test_error
    state.nb_exemples = nb_exemples
    state.time = time
    pylearn.version.record_versions(state, [theano, ift6266, pylearn])
    return channel.COMPLETE