Mercurial > pylearn
changeset 741:4617ee277698
fix a bad declaration of the updates in StackedDAAig
author   | Xavier Glorot <glorotxa@iro.umontreal.ca>
date     | Fri, 29 May 2009 10:48:20 -0400
parents  | 7ab4bc96cb12
children | 5aa4cf193197 b4aa46f856c1
files    | pylearn/algorithms/sandbox/DAA_inputs_groups.py
diffstat | 1 files changed, 2 insertions(+), 2 deletions(-)
--- a/pylearn/algorithms/sandbox/DAA_inputs_groups.py	Thu May 28 12:12:34 2009 -0400
+++ b/pylearn/algorithms/sandbox/DAA_inputs_groups.py	Fri May 29 10:48:20 2009 -0400
@@ -529,8 +529,8 @@
         self.totalgradients[-1] = [T.grad(self.totalcost[-2], paramstot) ,\
                                    T.grad(self.globalcost[-1], paramstot) ]
-        local_grads = dict((j, j - self.unsup_lr * g) for j,g in zip(self.daaig[-1].params,self.localgradients[-1]))
-        global_grads = dict((j, j - self.unsup_lr * g)\
+        local_grads = dict((j, j - self.sup_lr * g) for j,g in zip(self.daaig[-1].params,self.localgradients[-1]))
+        global_grads = dict((j, j - self.sup_lr * g)\
                 for j,g in zip(self.daaig[-1].params+paramsenc,self.globalgradients[-1]))
         if self.totalupdatebool:
             total_grads = dict((j, j - self.unsup_lr * g1 - self.sup_lr * g2)\
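The fix swaps the learning rate used for the supervised (local and global) update dictionaries from `self.unsup_lr` to `self.sup_lr`; the unsupervised rate remains only in the combined `total_grads` term. The underlying pattern, mapping each parameter to its gradient-descent update `p - lr * g`, can be sketched in plain Python (illustrative values and function name, not the original Theano code):

```python
def make_updates(params, grads, lr):
    """Map each parameter to its gradient-descent update: p - lr * g.

    In the Theano code above, the keys are shared variables and the
    values are symbolic update expressions; here we use plain floats
    as a stand-in to show the arithmetic.
    """
    return dict((p, p - lr * g) for p, g in zip(params, grads))

# Hypothetical parameters and their gradients.
params = [1.0, 2.0]
grads = [0.5, -0.25]

sup_lr = 0.1     # supervised learning rate
unsup_lr = 0.01  # unsupervised learning rate

# The bug: supervised gradients were scaled by unsup_lr instead of
# sup_lr, so supervised fine-tuning stepped at the wrong rate.
global_grads = make_updates(params, grads, sup_lr)
```

With `sup_lr = 0.1`, the first parameter updates to `1.0 - 0.1 * 0.5 = 0.95`; with the old `unsup_lr = 0.01` it would have moved only to `0.995`, which is why the mix-up silently slowed supervised training rather than crashing.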