pylearn: diff linear_regression.py @ 421:e01f17be270a
Kernel regression learning algorithm
| author | Yoshua Bengio <bengioy@iro.umontreal.ca> |
|---|---|
| date | Sat, 19 Jul 2008 10:11:22 -0400 |
| parents | 43d9aa93934e |
| children | fa4a5fee53ce |
--- a/linear_regression.py	Tue Jul 15 12:57:21 2008 -0400
+++ b/linear_regression.py	Sat Jul 19 10:11:22 2008 -0400
@@ -34,12 +34,15 @@
     we want to compute the squared errors.
 
     The predictor parameters are obtained analytically from the training set.
+
+    *** NOT IMPLEMENTED YET ***
     Training can proceed sequentially (with multiple calls to update with
     different disjoint subsets of the training sets). After each call to
     update the predictor is ready to be used (and optimized for the union
     of all the training sets passed to update since construction or since
     the last call to forget).
-
+    ***************************
+
     For each (input[t],output[t]) pair in a minibatch,::
 
        output_t = b + W * input_t
@@ -74,7 +77,7 @@
     def __init__(self, L2_regularizer=0,minibatch_size=10000):
         self.L2_regularizer=L2_regularizer
         self.equations = LinearRegressionEquations()
-        self.minibatch_size=1000
+        self.minibatch_size=minibatch_size
 
     def __call__(self,trainset):
         first_example = trainset[0]
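The docstring above says the predictor computes output_t = b + W * input_t and that its parameters are obtained analytically from the training set, with an optional L2_regularizer. The sketch below is only an illustration of that analytic (ridge) solution, not pylearn's actual implementation; the numpy-based helper `fit_linear_regression` and its signature are assumptions for this example.

```python
# Minimal sketch (assumed, not pylearn's code) of the analytic solution
# described in the docstring: output_t = b + W * input_t, parameters
# obtained in closed form from the training set with an L2 regularizer.
import numpy as np

def fit_linear_regression(X, Y, L2_regularizer=0.0):
    """Return (b, W) minimizing sum_t ||Y[t] - b - W X[t]||^2
    + L2_regularizer * ||W||^2 (the bias b is not regularized)."""
    n, d = X.shape
    # Append a constant 1 to each input so the bias is learned jointly.
    Xb = np.hstack([np.ones((n, 1)), X])
    reg = L2_regularizer * np.eye(d + 1)
    reg[0, 0] = 0.0               # do not penalize the bias term
    theta = np.linalg.solve(Xb.T @ Xb + reg, Xb.T @ Y)
    b, W = theta[0], theta[1:].T  # W has shape (n_outputs, d)
    return b, W

# Usage on toy data (hypothetical):
# X = np.random.randn(100, 3)
# Y = X @ np.array([[1.0], [2.0], [3.0]]) + 0.5
# b, W = fit_linear_regression(X, Y, L2_regularizer=0.1)
```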