pylearn: changeset 779:2c159439c47c
Sigmoid before logistic regression to avoid small gradient with tanh in StackedDAAig
author:    Xavier Glorot <glorotxa@iro.umontreal.ca>
date:      Sun, 21 Jun 2009 20:16:23 -0400
parents:   a985baadf74d
children:  e768674aa51f
files:     pylearn/algorithms/sandbox/DAA_inputs_groups.py
diffstat:  1 files changed, 1 insertions(+), 1 deletions(-)
--- a/pylearn/algorithms/sandbox/DAA_inputs_groups.py	Sat Jun 13 22:02:13 2009 -0400
+++ b/pylearn/algorithms/sandbox/DAA_inputs_groups.py	Sun Jun 21 20:16:23 2009 -0400
@@ -578,7 +578,7 @@
         # supervised layer
         print '\tLayer supervised init'
         self.inputs[-1] = copy.copy(self.inputs[-2])+[self.target]
-        self.daaig[-1] = LogRegN(in_sizeprec,self.n_out,inputprec,self.target)
+        self.daaig[-1] = LogRegN(in_sizeprec,self.n_out,sigmoid_act(self.daaig[-2].clean.hidden_activation),self.target)
         paramstot += self.daaig[-1].params
         if self.regularize:
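For context, a minimal numpy sketch (not part of the changeset) illustrating the rationale in the commit message. The change feeds sigmoid(hidden_activation) rather than the tanh output of the previous layer into the supervised logistic-regression layer: for the same pre-activation, tanh saturates roughly twice as fast as the logistic sigmoid, so its derivative, and with it the gradient backpropagated through the supervised layer, shrinks much sooner. The sigmoid below is the standard definition, not the repository's sigmoid_act.

    import numpy as np

    def sigmoid(x):
        # standard logistic sigmoid; stands in for pylearn's sigmoid_act
        return 1.0 / (1.0 + np.exp(-x))

    a = np.array([0.0, 1.0, 2.0, 4.0])  # hypothetical pre-activations

    # derivative of tanh: 1 - tanh(a)^2, decays like 4*exp(-2|a|)
    print(1.0 - np.tanh(a) ** 2)            # [1.0, 0.420, 0.0707, 0.00134]

    # derivative of sigmoid: s*(1-s), decays like exp(-|a|)
    s = sigmoid(a)
    print(s * (1.0 - s))                    # [0.25, 0.197, 0.105, 0.0177]

At a pre-activation of 4, the tanh derivative is already an order of magnitude smaller than the sigmoid's, which is one plausible reading of "small gradient with tanh" in the commit message.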