annotate mlp.py @ 118:d0a1bd0378c6

Finished draft of OneHiddenLayerNNetClassifier to debut learner.py
author Yoshua Bengio <bengioy@iro.umontreal.ca>
date Wed, 07 May 2008 15:07:56 -0400
parents 88257dfedf8c
children 2ca8dccba270

import math
import numpy
from learner import *
from theano import tensor as t
from theano.scalar import as_scalar
from nnet_ops import *

# This is one of the simplest examples of a learner, and illustrates
# the use of Theano.


class OneHiddenLayerNNetClassifier(MinibatchUpdatesTLearner):
    """
    Implement a straightforward classical feedforward
    one-hidden-layer neural net, with L2 regularization.

    The predictor parameters are obtained by minibatch/online gradient descent.
    Training can proceed sequentially (with multiple calls to update with
    different disjoint subsets of the training set).

    Hyper-parameters:
      - L2_regularizer
      - learning_rate
      - n_hidden

    For each (input_t,target_t) pair in a minibatch,::

       output_activations_t = b2+W2*tanh(b1+W1*input_t)
       output_t = softmax(output_activations_t)
       output_class_t = argmax(output_activations_t)
       class_error_t = 1_{output_class_t != target_t}
       nll_t = -log(output_t[target_t])

    and the training criterion is::

       loss = L2_regularizer*(||W1||^2 + ||W2||^2) + sum_t nll_t

    The parameters are [b1,W1,b2,W2] and are obtained by minimizing the loss by
    stochastic minibatch gradient descent::

       parameters[i] -= learning_rate * dloss/dparameters[i]

    The fields and attributes expected and produced by use and update are the following:

    - Input and output fields (example-wise quantities):

      - 'input' (always expected by use and update)
      - 'target' (optionally expected by use and always by update)
      - 'output' (optionally produced by use)
      - 'output_class' (optionally produced by use)
      - 'class_error' (optionally produced by use)
      - 'nll' (optionally produced by use)

    - optional attributes (optionally expected as input_dataset attributes)
      (warning: this may be dangerous, as the 'use' method will use those provided
      in the input_dataset rather than those learned during 'update'; there is
      currently no support for providing these to update):

      - 'L2_regularizer'
      - 'b1'
      - 'W1'
      - 'b2'
      - 'W2'
      - 'parameters' = [b1, W1, b2, W2]
      - 'regularization_term'

    """
    def attributeNames(self):
        return ["parameters","b1","W1","b2","W2","L2_regularizer","regularization_term"]

    def parameterAttributes(self):
        return ["b1","W1","b2","W2"]

    def useInputAttributes(self):
        return self.parameterAttributes()

    def useOutputAttributes(self):
        return []

    def updateInputAttributes(self):
        return self.parameterAttributes() + ["L2_regularizer"]

    def updateMinibatchInputFields(self):
        return ["input","target"]

    def updateEndOutputAttributes(self):
        return ["regularization_term"]

    def lossAttribute(self):
        return "minibatch_criterion"

    def defaultOutputFields(self, input_fields):
        output_fields = ["output", "output_class"]
        if "target" in input_fields:
            output_fields += ["class_error", "nll"]
        return output_fields

    def __init__(self, n_hidden, n_classes, learning_rate, init_range=1.):
        self._n_inputs = None  # set lazily by allocate(), from the first minibatch
        self._n_outputs = n_classes
        self._n_hidden = n_hidden
        self._init_range = init_range
        self.learning_rate = learning_rate # this is the float
        self._learning_rate = t.scalar('learning_rate') # this is the symbol
        self._input = t.matrix('input') # n_examples x n_inputs
        self._target = t.matrix('target') # n_examples x n_outputs
        self._L2_regularizer = as_scalar(0., 'L2_regularizer')
        self._W1 = t.matrix('W1')
        self._W2 = t.matrix('W2')
        self._b1 = t.row('b1')
        self._b2 = t.row('b2')
        # ||W1||^2 + ||W2||^2 from the docstring: sums of squared entries
        # (t.dot(W,W) would be a matrix product, not a squared norm)
        self._regularizer = self._L2_regularizer * (t.sum(self._W1*self._W1) + t.sum(self._W2*self._W2))
        self._output_activations = self._b2 + t.dot(t.tanh(self._b1 + t.dot(self._input, self._W1.T)), self._W2.T)
        self._output_class = t.argmax(self._output_activations, 1)
        self._class_error = self._output_class != self._target
        # also defines self._output = softmax(self._output_activations)
        self._nll, self._output = crossentropy_softmax_1hot(self._output_activations, self._target)
        self._minibatch_criterion = self._nll + self._regularizer / t.shape(self._input)[0]
        MinibatchUpdatesTLearner.__init__(self)

    def allocate(self, minibatch):
        minibatch_n_inputs = minibatch["input"].shape[1]
        if not self._n_inputs:
            self._n_inputs = minibatch_n_inputs
            self.b1 = numpy.zeros(self._n_hidden)
            self.b2 = numpy.zeros(self._n_outputs)
            self.forget()
        elif self._n_inputs != minibatch_n_inputs:
            # if the input changes dimension on the fly, we resize and forget everything
            self.forget()

    def forget(self):
        if self._n_inputs:
            r = self._init_range / math.sqrt(self._n_inputs)
            self.W1 = numpy.random.uniform(low=-r, high=r,
                                           size=(self._n_hidden, self._n_inputs))
            r = self._init_range / math.sqrt(self._n_hidden)
            self.W2 = numpy.random.uniform(low=-r, high=r,
                                           size=(self._n_outputs, self._n_hidden))
            self.b1[:] = 0
            self.b2[:] = 0
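For reference, the same forward computation (and one minibatch SGD step on the output layer) can be sketched in plain NumPy, outside the Theano graph; the shapes, `lr`, and the toy data are assumptions for illustration, not part of the class above:

```python
import numpy

rng = numpy.random.RandomState(0)
n_inputs, n_hidden, n_classes, n_examples = 4, 5, 3, 2

# parameters, shaped as in the class: W1 is n_hidden x n_inputs, W2 is n_classes x n_hidden
W1 = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))
b1 = numpy.zeros(n_hidden)
W2 = rng.uniform(-0.5, 0.5, size=(n_classes, n_hidden))
b2 = numpy.zeros(n_classes)

x = rng.uniform(-1, 1, size=(n_examples, n_inputs))
targets = numpy.array([0, 2])  # one class index per example

# forward pass: output_activations = b2 + W2*tanh(b1 + W1*input)
h = numpy.tanh(b1 + x.dot(W1.T))            # n_examples x n_hidden
acts = b2 + h.dot(W2.T)                     # n_examples x n_classes
e = numpy.exp(acts - acts.max(axis=1, keepdims=True))
probs = e / e.sum(axis=1, keepdims=True)    # row-wise softmax

# gradient of sum_t nll_t w.r.t. the output activations is probs - onehot(targets)
d_acts = probs.copy()
d_acts[numpy.arange(n_examples), targets] -= 1.0

# one minibatch SGD step on W2/b2 (W1/b1 would follow by backprop through tanh)
lr = 0.1
W2 -= lr * d_acts.T.dot(h)
b2 -= lr * d_acts.sum(axis=0)
```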


class MLP(MinibatchUpdatesTLearner):
    """
    Implement a feedforward multi-layer perceptron, with or without L1 and/or L2 regularization.

    The predictor parameters are obtained by minibatch/online gradient descent.
    Training can proceed sequentially (with multiple calls to update with
    different disjoint subsets of the training set).

    Hyper-parameters:
      - L1_regularizer
      - L2_regularizer
      - neuron_sparsity_regularizer
      - initial_learning_rate
      - learning_rate_decrease_rate
      - n_hidden_per_layer (a list of integers)
      - activation_function ("sigmoid", "tanh", or "ratio")

    The output/task type (classification, regression, etc.) is obtained by specializing MLP.

    For each (input[t],target[t]) pair in a minibatch,::

       activation[0] = input[t]
       for k=1 to n_hidden_layers:
          activation[k] = activation_function(b[k] + W[k]*activation[k-1])
       output[t] = output_activation_function(b[n_hidden_layers+1] + W[n_hidden_layers+1]*activation[n_hidden_layers])

    and the b and W are obtained by minimizing the following by stochastic minibatch gradient descent::

       L2_regularizer sum_{kij} W_{kij}^2 + L1_regularizer sum_{kij} |W_{kij}|
       + neuron_sparsity_regularizer sum_{ki} |b_{ki} + infinity|
       - sum_t log P_{output_model}(target[t] | output[t])

    The fields and attributes expected and produced by use and update are the following:

    - Input and output fields (example-wise quantities):

      - 'input' (always expected by use and update)
      - 'target' (optionally expected by use and always by update)
      - 'output' (optionally produced by use)
      - error fields produced by sub-classes of MLP

    - optional attributes (optionally expected as input_dataset attributes)
      (warning: this may be dangerous, as the 'use' method will use those provided
      in the input_dataset rather than those learned during 'update'; there is
      currently no support for providing these to update):

      - 'L1_regularizer'
      - 'L2_regularizer'
      - 'b'
      - 'W'
      - 'parameters' = [b[1], W[1], b[2], W[2], ...]
      - 'regularization_term'

    """
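The layer recursion in the docstring above can be sketched as a plain-NumPy loop; `mlp_forward`, the shapes, and the linear output layer are illustrative assumptions (the real output model is supplied by sub-classes):

```python
import numpy

def mlp_forward(x, Ws, bs, activation_function=numpy.tanh):
    """Apply the layer recursion from the docstring; the last layer is left
    linear (the output_activation_function belongs to the sub-class)."""
    activation = x
    for W, b in zip(Ws[:-1], bs[:-1]):
        activation = activation_function(b + activation.dot(W.T))
    return bs[-1] + activation.dot(Ws[-1].T)

rng = numpy.random.RandomState(0)
# e.g. n_hidden_per_layer = [4, 3], with 5 inputs and 2 outputs
sizes = [5, 4, 3, 2]
Ws = [rng.uniform(-0.1, 0.1, size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
bs = [numpy.zeros(m) for m in sizes[1:]]
out = mlp_forward(rng.uniform(-1, 1, size=(10, 5)), Ws, bs)
```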

    def attributeNames(self):
        return ["parameters","b","W","L1_regularizer","L2_regularizer","neuron_sparsity_regularizer","regularization_term"]

    def useInputAttributes(self):
        return ["b","W"]

    def useOutputAttributes(self):
        return []

    def updateInputAttributes(self):
        return ["b","W","L1_regularizer","L2_regularizer","neuron_sparsity_regularizer"]

    def updateMinibatchInputFields(self):
        return ["input","target"]

    def updateMinibatchInputAttributes(self):
        return ["b","W"]

    def updateMinibatchOutputAttributes(self):
        return ["new_XtX","new_XtY"]

    def updateEndInputAttributes(self):
        return ["theta","XtX","XtY"]

    def updateEndOutputAttributes(self):
        return ["new_theta","b","W","regularization_term"] # CHECK: WILL b AND W CONTAIN OLD OR NEW THETA? @todo i.e. order of computation = ?

    def parameterAttributes(self):
        return ["b","W"]

    def defaultOutputFields(self, input_fields):
        output_fields = ["output"]
        if "target" in input_fields:
            output_fields.append("squared_error")
bengioy@bengiomac.local
parents:
diff changeset
232 return output_fields
88257dfedf8c Added another work in progress, for mlp's
bengioy@bengiomac.local
parents:
diff changeset
233
88257dfedf8c Added another work in progress, for mlp's
bengioy@bengiomac.local
parents:
diff changeset
    def __init__(self):
        self._input = t.matrix('input')   # n_examples x n_inputs
        self._target = t.matrix('target') # n_examples x n_outputs
        self._L2_regularizer = as_scalar(0.,'L2_regularizer')
        self._theta = t.matrix('theta')
        self._W = self._theta[:,1:]
        self._b = self._theta[:,0]
        self._XtX = t.matrix('XtX')
        self._XtY = t.matrix('XtY')
        self._extended_input = t.prepend_one_to_each_row(self._input)
        self._output = t.dot(self._input,self._W.T) + self._b # (n_examples, n_outputs) matrix
        self._squared_error = t.sum_within_rows(t.sqr(self._output-self._target)) # (n_examples,) vector
        # sum of squared weights; t.dot(W,W) would be a matrix product, not a scalar penalty
        self._regularizer = self._L2_regularizer * t.sum(t.sqr(self._W))
        self._new_XtX = add_inplace(self._XtX,t.dot(self._extended_input.T,self._extended_input))
        self._new_XtY = add_inplace(self._XtY,t.dot(self._extended_input.T,self._target))
        self._new_theta = t.solve_inplace(self._theta,self._XtX,self._XtY)

        # this class derives from MinibatchUpdatesTLearner, so initialize that base
        MinibatchUpdatesTLearner.__init__(self)

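The graph above is ordinary linear regression by accumulated sufficient statistics: each minibatch adds X̃ᵀX̃ to XtX and X̃ᵀY to XtY (where X̃ is the input with a column of ones prepended for the bias), and the final solve recovers theta. A minimal numpy sketch of the same computation, with illustrative names only:

```python
import numpy

rng = numpy.random.RandomState(0)
n_inputs, n_outputs = 3, 2
true_theta = rng.randn(n_outputs, 1 + n_inputs)    # [b | W], as in the class

XtX = numpy.zeros((1 + n_inputs, 1 + n_inputs))
XtY = numpy.zeros((1 + n_inputs, n_outputs))

for _ in range(5):                                 # five minibatches
    X = rng.randn(10, n_inputs)
    Xext = numpy.hstack([numpy.ones((10, 1)), X])  # prepend 1 to each row
    Y = numpy.dot(Xext, true_theta.T)              # noiseless targets
    XtX += numpy.dot(Xext.T, Xext)                 # the new_XtX update
    XtY += numpy.dot(Xext.T, Y)                    # the new_XtY update

# end-of-training solve: theta has shape (n_outputs, 1+n_inputs)
theta = numpy.linalg.solve(XtX, XtY).T
```

With noiseless targets the accumulated normal equations recover `true_theta` exactly (up to floating point), since XtY = XtX · thetaᵀ by construction.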
    def allocate(self,minibatch):
        minibatch_n_inputs = minibatch["input"].shape[1]
        minibatch_n_outputs = minibatch["target"].shape[1]
        if not self._n_inputs:
            self._n_inputs = minibatch_n_inputs
            self._n_outputs = minibatch_n_outputs
            self.XtX = numpy.zeros((1+self._n_inputs,1+self._n_inputs))
            self.XtY = numpy.zeros((1+self._n_inputs,self._n_outputs))
            self.theta = numpy.zeros((self._n_outputs,1+self._n_inputs))
            self.forget()
        elif self._n_inputs!=minibatch_n_inputs or self._n_outputs!=minibatch_n_outputs:
            # if the input or target changes dimension on the fly,
            # we resize and forget everything
            self.forget()

    def forget(self):
        if self._n_inputs and self._n_outputs:
            self.XtX.resize((1+self._n_inputs,1+self._n_inputs))
            self.XtY.resize((1+self._n_inputs,self._n_outputs))
            self.XtX[:,:] = 0
            self.XtY[:,:] = 0
            # put the L2 regularizer on the diagonal of XtX, skipping the bias
            # entry; numpy.diag returns a copy (or read-only view), so writing
            # into it would not touch XtX itself
            d = numpy.arange(1,1+self._n_inputs)
            self.XtX[d,d] = self.L2_regularizer
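One subtlety in forget(): depending on the numpy version, numpy.diag(A) is either a fresh copy or a read-only view, so `numpy.diag(A)[1:] = x` either silently does nothing or raises; it never writes into A. Indexing the diagonal entries of A directly does write through. A quick demonstration (standalone, not from the class):

```python
import numpy

A = numpy.zeros((4, 4))
# Broken form: numpy.diag(A)[1:] = 0.1 cannot modify A, because numpy.diag
# returns a copy (old numpy) or a read-only view (newer numpy).
# Working form: fancy-index the diagonal of A itself, skipping entry (0,0).
d = numpy.arange(1, 4)
A[d, d] = 0.1
```

This is the idiom used in forget() above to place the L2 regularizer on the non-bias diagonal of XtX.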