annotate linear_regression.py @ 77:1e2bb5bad636

toying with different ways to implement learners
author bengioy@bengiomac.local
date Sun, 04 May 2008 15:09:22 -0400
parents 90e4c0784d6e
children 3499918faa9d

from learner import *
from theano import tensor as t
from compile import Function
from theano.scalar import as_scalar
import numpy
import copy

# This is one of the simplest examples of a learner, and it illustrates
# the use of theano.
class LinearRegression(Learner):
    """
    Implement linear regression, with or without L2 regularization
    (the former is called Ridge Regression and the latter Ordinary Least Squares).

    The predictor parameters (b and W) are obtained analytically from the
    training set; the L2 regularization coefficient lambda is provided as an
    input attribute.

    For each example input_t in a minibatch, the prediction is::

        output_t = b + W * input_t

    where b and W are obtained by minimizing::

        lambda * sum_{ij} W_{ij}^2  +  sum_t ||output_t - target_t||^2

    Let X be the whole training set input matrix (one input example per row),
    with the first column full of 1's, and let Y be the whole training set
    target matrix (one example's target vector per row).
    Let theta be the matrix with b in its first column and W in the others;
    then each theta[:,i] is the solution of the linear system::

        XtX * theta[:,i] = XtY[:,i]

    where XtX is the (n_inputs+1) x (n_inputs+1) matrix X'*X with lambda added
    on the diagonal everywhere except at entry (0,0), and XtY is the
    (n_inputs+1) x n_outputs matrix X'*Y.

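    For illustration only, here is a rough standalone numpy sketch of that
    analytic solution (the helper name and argument names are made up for this
    example; x is (n_examples, n_inputs) and y is (n_examples, n_outputs))::

        import numpy

        def ridge_solve(x, y, lambda_=0.):
            # prepend a column of ones so that the first row of theta is the bias b
            X = numpy.hstack([numpy.ones((x.shape[0], 1)), x])
            XtX = numpy.dot(X.T, X)
            # add lambda on the diagonal, except at (0,0): the bias is not penalized
            XtX[1:, 1:] += lambda_ * numpy.eye(x.shape[1])
            XtY = numpy.dot(X.T, y)
            theta = numpy.linalg.solve(XtX, XtY)   # solves XtX * theta = XtY
            b, W = theta[0], theta[1:].T           # W has shape (n_outputs, n_inputs)
            return b, W
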
    The fields and attributes expected and produced by use and update are the
    following (a short usage sketch follows the list):

    - Input and output fields (example-wise quantities):

      - 'input' (always expected by use and update as an input_dataset field)
      - 'target' (optionally expected by use and update as an input_dataset field)
      - 'output' (optionally produced by use as an output dataset field)
      - 'squared_error' (optionally produced by use as an output dataset field, needs 'target') = example-wise squared error

    - optional input attributes (optionally expected as input_dataset attributes):

      - 'lambda' (only used by update)
      - 'b' (only used by use)
      - 'W' (only used by use)

    - optional output attributes (available in self and optionally in the output dataset):

      - 'b' (only set by update)
      - 'W' (only set by update)
      - 'regularization_term' (only set by update)
      - 'XtX' (only set by update)
      - 'XtY' (only set by update)

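    A hypothetical usage sketch (the dataset objects and the exact call
    pattern are assumptions for illustration, not a tested API)::

        learner = LinearRegression(lambda_=0.1)
        learner.update(training_set)      # accumulate XtX, XtY and solve for theta
        results = learner.use(test_set)   # dataset with 'output' (and 'squared_error' if 'target' is present)
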
    """

    # definitions specific to linear regression:

    def global_inputs(self):
        self.lambda_ = as_scalar(0., 'lambda')  # 'lambda' is a reserved word in Python, hence the attribute name lambda_
        self.theta = t.matrix('theta')
        self.W = self.theta[:, 1:]
        self.b = self.theta[:, 0]
        self.XtX = t.matrix('XtX')
        self.XtY = t.matrix('XtY')

    def global_outputs(self):
        self.regularizer = self.lambda_ * t.sum(t.sqr(self.W))  # lambda * sum_{ij} W_{ij}^2
        self.loss = self.regularizer + t.sum(self.squared_error)  # this only makes sense if the whole training set fits in memory in a minibatch
        self.loss_function = Function([self.W, self.lambda_, self.squared_error], [self.loss])

    def initialize(self):
        self.XtX.resize((1 + self.n_inputs, 1 + self.n_inputs))
        self.XtY.resize((1 + self.n_inputs, self.n_outputs))
        self.XtX.data[:, :] = 0
        self.XtY.data[:, :] = 0
        # write lambda on the diagonal of XtX, except at (0,0): the bias is not regularized
        diag_indices = numpy.arange(1, 1 + self.n_inputs)
        self.XtX.data[diag_indices, diag_indices] = self.lambda_.data

    def updated_variables(self):
        self.new_XtX = self.XtX + t.dot(self.extended_input.T, self.extended_input)
        self.new_XtY = self.XtY + t.dot(self.extended_input.T, self.target)
        self.new_theta = t.solve(self.new_XtX, self.new_XtY)

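    # Note on the updated variables above: XtX and XtY are sums over examples,
    # so each minibatch's contribution (the dot products of the extended input
    # with itself and with the target) can simply be added to the running
    # totals; solving the normal equations with the accumulated XtX and XtY
    # gives the same theta as a single pass over the whole training set.
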
    def minibatch_wise_inputs(self):
        self.input = t.matrix('input')    # n_examples x n_inputs
        self.target = t.matrix('target')  # n_examples x n_outputs

    def minibatch_wise_outputs(self):
        # self.input is a (n_examples, n_inputs) minibatch matrix
        self.extended_input = t.prepend_one_to_each_row(self.input)
        self.output = t.dot(self.input, self.W.T) + self.b  # (n_examples, n_outputs) matrix
        self.squared_error = t.sum_within_rows(t.sqr(self.output - self.target))  # (n_examples,) vector

    def attribute_names(self):
        return ["lambda_", "b", "W", "regularization_term", "XtX", "XtY"]

    def default_output_fields(self, input_fields):
        output_fields = ["output"]
        if "target" in input_fields:
            output_fields.append("squared_error")
        return output_fields

    # general machinery based on these functions

    def minibatchwise_use_functions(self, input_fields, output_fields):
        if not output_fields:
            output_fields = self.default_output_fields(input_fields)
        key = (tuple(input_fields), tuple(output_fields))  # tuples, so the key is hashable
        if key not in self.use_functions_dictionary:
            self.use_functions_dictionary[key] = Function(self.names2attributes(input_fields),
                                                          self.names2attributes(output_fields))
        return self.use_functions_dictionary[key]

    def names2attributes(self, names, return_Result=True):
        if return_Result:
            return [getattr(self, name) for name in names]
        else:
            return [getattr(self, name).data for name in names]

    def use(self, input_dataset, output_fieldnames=None, test_stats_collector=None, copy_inputs=True):
        minibatchwise_use_function = self.minibatchwise_use_functions(input_dataset.fieldNames(), output_fieldnames)
        virtual_output_dataset = ApplyFunctionDataSet(input_dataset,
                                                      minibatchwise_use_function,
                                                      True, DataSet.numpy_vstack,
                                                      DataSet.numpy_hstack)
        # actually force the computation
        output_dataset = CachedDataSet(virtual_output_dataset, True)
        if copy_inputs:
            output_dataset = input_dataset | output_dataset
        # copy into the output dataset the attributes that should travel with it
        for attribute in self.attribute_names():
            # .data assumes that all attributes are Result objects
            setattr(output_dataset, attribute, copy.deepcopy(getattr(self, attribute).data))
        if test_stats_collector:
            test_stats_collector.update(output_dataset)
            for attribute in test_stats_collector.attribute_names():
                output_dataset[attribute] = copy.deepcopy(test_stats_collector[attribute])
        return output_dataset

    def update(self, training_set, train_stats_collector=None):
        pass  # body not written yet in this draft

    def __init__(self, lambda_=0., max_memory_use=500):
        """
        @type lambda_: float
        @param lambda_: regularization coefficient
        """

        W = t.matrix('W')
        # b is a broadcastable row vector (can be replicated into
        # as many rows as there are examples in the minibatch)
        b = t.row('b')
        minibatch_input = t.matrix('input')    # n_examples x n_inputs
        minibatch_target = t.matrix('target')  # n_examples x n_outputs
        minibatch_output = t.dot(minibatch_input, W.T) + b  # n_examples x n_outputs
        lambda_ = as_scalar(lambda_)
        regularizer = lambda_ * t.sum(t.sqr(W))  # lambda * sum_{ij} W_{ij}^2
        example_squared_error = t.sum_within_rows(t.sqr(minibatch_output - minibatch_target))
        self.output_function = Function([W, b, minibatch_input], [minibatch_output])
        self.squared_error_function = Function([minibatch_output, minibatch_target], [example_squared_error])
        self.loss_function = Function([W, example_squared_error], [regularizer + t.sum(example_squared_error)])
        self.W = None
        self.b = None
        self.XtX = None
        self.XtY = None

    def forget(self):
        if self.W is not None:
            self.XtX *= 0
            self.XtY *= 0

    def use(self, input_dataset, output_fieldnames=None, copy_inputs=True):
        input_fieldnames = input_dataset.fieldNames()
        assert "input" in input_fieldnames
        if not output_fieldnames:
            output_fieldnames = ["output"]
            if "target" in input_fieldnames:
                output_fieldnames += ["squared_error"]
        else:
            if "squared_error" in output_fieldnames or "total_loss" in output_fieldnames:
                assert "target" in input_fieldnames

        use_functions = []
        for output_fieldname in output_fieldnames:
            if output_fieldname == "output":
                use_functions.append(self.output_function)
            elif output_fieldname == "squared_error":
                use_functions.append(self.squared_error_function)

        n_examples = len(input_dataset)

        for minibatch in input_dataset.minibatches(minibatch_size=minibatch_size, allow_odd_last_minibatch=True):
            use_function(