annotate linear_regression.py @ 100:574f4db76022

restructuring and added test

author: Frederic Bastien <bastienf@iro.umontreal.ca>
date:   Tue, 06 May 2008 14:01:22 -0400
parents: c4726e19b8ec
children: c4916445e025
from learner import *
from theano import tensor as t
from compile import Function
from theano.scalar import as_scalar
import numpy  # used by initialize() below

# This is one of the simplest examples of a learner, and it illustrates
# the use of theano.
class LinearRegression(Learner):
    """
    Implement linear regression, with or without L2 regularization
    (the former is called Ridge Regression and the latter Ordinary Least Squares).

    The predictor parameters are obtained analytically from the training set.
    Training can proceed sequentially (with multiple calls to update with
    different disjoint subsets of the training set). After each call to
    update the predictor is ready to be used (and optimized for the union
    of all the training sets passed to update since construction or since
    the last call to forget).

    The L2 regularization coefficient is obtained analytically.
    For each (input_t, target_t) pair in a minibatch,::

       output_t = b + W * input_t

    where b and W are obtained by minimizing::

       lambda * sum_{ij} W_{ij}^2  +  sum_t ||output_t - target_t||^2

    Let X be the whole training set input matrix (one input example per row),
    with a first column full of 1's, and let Y be the whole training set
    target matrix (one example's target vector per row).
    Let theta be the matrix with b in its first column and W in the others;
    then each theta[:,i] is the solution of the linear system::

       XtX * theta[:,i] = XtY[:,i]

    where XtX is an (n_inputs+1) x (n_inputs+1) matrix containing X'*X
    plus lambda on the diagonal except at (0,0),
    and XtY is an (n_inputs+1) x n_outputs matrix containing X'*Y.

    The fields and attributes expected and produced by use and update are the following:

    - Input and output fields (example-wise quantities):

      - 'input' (always expected by use and update as an input_dataset field)
      - 'target' (optionally expected by use and update as an input_dataset field)
      - 'output' (optionally produced by use as an output dataset field)
      - 'squared_error' (optionally produced by use as an output dataset field,
        needs 'target') = example-wise squared error

    - Optional attributes (optionally expected as input_dataset attributes)
      (warning: this may be dangerous; the 'use' method will use those provided
      in the input_dataset rather than those learned during 'update'; currently
      there is no support for providing these to update):

      - 'lambda'
      - 'b'
      - 'W'
      - 'regularization_term'
      - 'XtX'
      - 'XtY'
    """
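The normal equations described in the docstring can be checked with a standalone plain-numpy sketch (outside the class; all names below are illustrative, not part of the Learner API):

```python
# Ridge-regression normal equations: XtX * theta[:,i] = XtY[:,i],
# with lambda added on the diagonal of XtX except at (0,0).
import numpy as np

rng = np.random.RandomState(0)
n_examples, n_inputs, n_outputs = 50, 3, 2
# X has a first column full of 1's (for the bias b)
X = np.hstack([np.ones((n_examples, 1)), rng.randn(n_examples, n_inputs)])
true_theta = rng.randn(n_inputs + 1, n_outputs)
Y = X.dot(true_theta)

lambda_ = 0.1  # L2 regularization coefficient ('lambda' is a Python keyword)
XtX = X.T.dot(X)
# add lambda on the diagonal, except at (0,0): the bias is not regularized
idx = np.arange(1, n_inputs + 1)
XtX[idx, idx] += lambda_
XtY = X.T.dot(Y)

# each theta[:, i] solves the linear system XtX * theta[:, i] = XtY[:, i]
theta = np.linalg.solve(XtX, XtY)
b, W = theta[0, :], theta[1:, :].T  # first row is the bias b, the rest is W
```

With `lambda_ = 0`, `theta` recovers `true_theta` exactly here, since `Y` is noise-free.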
    def attributeNames(self):
        return ["lambda", "b", "W", "regularization_term", "XtX", "XtY"]

    # definitions specific to linear regression:

    def global_inputs(self):
        # 'lambda' is a Python reserved word, so the attribute is named lambda_
        self.lambda_ = as_scalar(0., 'lambda')
        self.theta = t.matrix('theta')
        self.W = self.theta[:, 1:]
        self.b = self.theta[:, 0]
        self.XtX = t.matrix('XtX')
        self.XtY = t.matrix('XtY')

    def global_outputs(self):
        self.regularizer = self.lambda_ * t.sum(t.sqr(self.W))  # lambda * sum_{ij} W_{ij}^2
        # this only makes sense if the whole training set fits in memory in a minibatch
        self.loss = self.regularizer + t.sum(self.squared_error)
        self.loss_function = Function([self.W, self.lambda_, self.squared_error], [self.loss])

    def initialize(self):
        self.XtX.resize((1 + self.n_inputs, 1 + self.n_inputs))
        self.XtY.resize((1 + self.n_inputs, self.n_outputs))
        self.XtX.data[:, :] = 0
        self.XtY.data[:, :] = 0
        # put lambda on the diagonal of XtX, except at (0,0) (the bias is not
        # regularized); numpy.diag returns a copy, so index the array directly
        d = numpy.arange(1, 1 + self.n_inputs)
        self.XtX.data[d, d] = self.lambda_.data

    def updated_variables(self):
        self.new_XtX = self.XtX + t.dot(self.extended_input.T, self.extended_input)
        self.new_XtY = self.XtY + t.dot(self.extended_input.T, self.target)
        self.new_theta = t.solve(self.new_XtX, self.new_XtY)
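The update scheme above accumulates XtX and XtY across disjoint minibatches, so solving the normal equations after each update yields the predictor for the union of everything seen so far. A standalone numpy sketch of that property (names illustrative, not the Learner API):

```python
# Accumulating XtX and XtY over disjoint minibatches gives the same theta
# as a single pass over the full training set.
import numpy as np

rng = np.random.RandomState(1)
X = np.hstack([np.ones((40, 1)), rng.randn(40, 3)])  # first column of 1's
Y = rng.randn(40, 2)

XtX = np.zeros((4, 4))
XtY = np.zeros((4, 2))
for start in range(0, 40, 10):            # four disjoint minibatches
    Xb, Yb = X[start:start + 10], Y[start:start + 10]
    XtX += Xb.T.dot(Xb)                   # new_XtX = XtX + Xb' * Xb
    XtY += Xb.T.dot(Yb)                   # new_XtY = XtY + Xb' * Yb
theta_seq = np.linalg.solve(XtX, XtY)     # predictor after the last update

theta_full = np.linalg.solve(X.T.dot(X), X.T.dot(Y))  # single-batch solution
```

Resetting XtX and XtY to zero, as forget() does below, restarts the accumulation.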
    def minibatch_wise_inputs(self):
        self.input = t.matrix('input')    # n_examples x n_inputs
        self.target = t.matrix('target')  # n_examples x n_outputs

    def minibatch_wise_outputs(self):
        # self.input is an (n_examples, n_inputs) minibatch matrix
        self.extended_input = t.prepend_one_to_each_row(self.input)
        self.output = t.dot(self.input, self.W.T) + self.b  # (n_examples, n_outputs) matrix
        self.squared_error = t.sum_within_rows(t.sqr(self.output - self.target))  # (n_examples,) vector

    def defaultOutputFields(self, input_fields):
        output_fields = ["output"]
        if "target" in input_fields:
            output_fields.append("squared_error")
        return output_fields

    # general plumbing based on the functions above

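The field-selection rule in defaultOutputFields above can be sketched as a standalone pure function (illustrative name, not the Learner API): 'output' is always produced, and 'squared_error' only when a 'target' field is available.

```python
# Sketch of the defaultOutputFields logic as a pure function.
def default_output_fields(input_fields):
    output_fields = ["output"]          # always computable from 'input'
    if "target" in input_fields:        # squared error needs a target
        output_fields.append("squared_error")
    return output_fields
```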
    def __init__(self, lambda_=0., max_memory_use=500):
        """
        @type lambda_: float
        @param lambda_: regularization coefficient
        """

        W = t.matrix('W')
        # b is a broadcastable row vector (can be replicated into
        # as many rows as there are examples in the minibatch)
        b = t.row('b')
        minibatch_input = t.matrix('input')    # n_examples x n_inputs
        minibatch_target = t.matrix('target')  # n_examples x n_outputs
        minibatch_output = t.dot(minibatch_input, W.T) + b  # n_examples x n_outputs
        lambda_ = as_scalar(lambda_)
        regularizer = lambda_ * t.sum(t.sqr(W))  # lambda * sum_{ij} W_{ij}^2
        example_squared_error = t.sum_within_rows(t.sqr(minibatch_output - minibatch_target))
        self.output_function = Function([W, b, minibatch_input], [minibatch_output])
        self.squared_error_function = Function([minibatch_output, minibatch_target],
                                               [example_squared_error])
        self.loss_function = Function([W, lambda_, example_squared_error],
                                      [regularizer + t.sum(example_squared_error)])
        self.W = None
        self.b = None
        self.XtX = None
        self.XtY = None

    def forget(self):
        if self.W is not None:
            self.XtX *= 0
            self.XtY *= 0

    def use(self, input_dataset, output_fieldnames=None, copy_inputs=True):
        input_fieldnames = input_dataset.fieldNames()
        assert "input" in input_fieldnames
        if not output_fieldnames:
            output_fieldnames = ["output"]
            if "target" in input_fieldnames:
                output_fieldnames += ["squared_error"]
        else:
            if "squared_error" in output_fieldnames or "total_loss" in output_fieldnames:
                assert "target" in input_fieldnames

        use_functions = []
        for output_fieldname in output_fieldnames:
            if output_fieldname == "output":
                use_functions.append(self.output_function)
            elif output_fieldname == "squared_error":
                use_functions.append(self.squared_error_function)

        n_examples = len(input_dataset)

        for minibatch in input_dataset.minibatches(minibatch_size=minibatch_size, allow_odd_last_minibatch=True):
            use_function(
1e2bb5bad636 toying with different ways to implement learners
bengioy@bengiomac.local
parents: 75
diff changeset
170