comparison pylearn/shared/layers/sigmoidal_layer.py @ 1405:f9e4d71aa353

Add L1 and L2² costs to sigmoidal layer
author Pascal Lamblin <lamblinp@iro.umontreal.ca>
date Wed, 26 Jan 2011 16:55:44 -0500
parents 912be602c3ac
children
comparing 1404:89017617ab36 with 1405:f9e4d71aa353
 18  18         :param w: a symbolic weight matrix of shape (n_in, n_out)
 19  19         :param b: symbolic bias terms of shape (n_out,)
 20  20         :param squash: a squashing function
 21  21         """
 22  22         output = squash_fn(tensor.dot(input, w) + b)
     23         l1 = abs(w).sum()
     24         l2_sqr = (w**2).sum()
 23  25         update_locals(self, locals())
 24  26
 25  27     @classmethod
 26  28     def new(cls, rng, input, n_in, n_out, squash_fn=tensor.tanh, dtype=None):
 27  29         """Allocate a SigmoidLayer with weights to transform inputs with n_in dimensions,