comparison sandbox/weights.py @ 466:23221eefb70e
Added pylearn.sandbox.weights.random_weights
author:    Joseph Turian <turian@iro.umontreal.ca>
date:      Wed, 15 Oct 2008 18:59:55 -0400
parents:
children:  3daabc7f94ff
comparing 465:8cde974b6486 with 466:23221eefb70e
1 """ | |
2 Routine to initialize weights. | |
3 | |
4 @note: We assume that numpy.random.seed() has already been performed. | |
5 """ | |
6 | |
7 from math import sqrt | |
8 import numpy.random | |
9 def random_weights(nin, nout, scale_by=sqrt(3)): | |
10 """ | |
11 Generate an initial weight matrix with nin inputs (rows) and nout | |
12 outputs (cols). | |
13 Each weight is chosen uniformly at random to be in range: | |
14 [-scale_by/sqrt(nin), +scale_by/sqrt(nin)] | |
15 @note: Play with scale_by! | |
16 Ronan derives scale_by=sqrt(3) because that gives variance of | |
17 1 to something (I forget, ask Yoshua for the derivation). However, | |
18 Ronan got better results by accidentally using scale_by=1. Yoshua | |
19 hypothesizes this is because the variance will get telescopically | |
20 smaller as we go up the layers [need more explanation of this | |
21 argument]. | |
22 @note: Things may get even trickier if the same weights are being | |
23 shared in multiple places. | |
24 """ | |
25 return (numpy.random.rand(nin, nout) * 2.0 - 1) * scale_by / sqrt(nin) |
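
A minimal usage sketch (not part of this changeset): it assumes the module is importable as pylearn.sandbox.weights, following the commit message, and checks empirically that the returned weights stay inside the documented range and have variance close to 1/nin with the default scale_by=sqrt(3).

import numpy
import numpy.random

# Hypothetical import path, taken from the commit message
# "Added pylearn.sandbox.weights.random_weights".
from pylearn.sandbox.weights import random_weights

numpy.random.seed(0)           # the module assumes the caller has seeded the RNG
nin, nout = 100, 50
W = random_weights(nin, nout)  # default scale_by = sqrt(3)
assert W.shape == (nin, nout)

# Every weight lies in [-sqrt(3)/sqrt(nin), +sqrt(3)/sqrt(nin)].
bound = numpy.sqrt(3.0) / numpy.sqrt(nin)
assert numpy.abs(W).max() <= bound + 1e-12

# A Uniform(-a, a) variable has variance a**2/3, so with a = sqrt(3)/sqrt(nin)
# each weight has variance about 1/nin, and a pre-activation summed over nin
# unit-variance inputs has variance close to 1.
assert abs(W.var() - 1.0 / nin) < 1e-3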