annotate sandbox/sparse_random_autoassociator/main.py @ 416:8849eba55520

Can now do minibatch update
author Joseph Turian <turian@iro.umontreal.ca>
date Fri, 11 Jul 2008 16:34:46 -0400
parents 36baeb7125a4
children
#!/usr/bin/python
"""
An autoassociator for sparse inputs, using Ronan Collobert + Jason
Weston's sampling trick (2008).

The learned model is::
    h = sigmoid(dot(x, w1) + b1)
    y = sigmoid(dot(h, w2) + b2)

We assume that most of the inputs are zero, and hence that
we can separate x into xnonzero, x's nonzero components, and
xzero, a sample of the zeros. We sample---randomly, without
replacement---ZERO_SAMPLE_SIZE zero columns from x.

The desideratum is that every nonzero entry is separated from every
zero entry by a margin of at least MARGIN.
For each ynonzero, we want it to exceed max(yzero) by at least MARGIN.
For each yzero, we want it to be exceeded by min(ynonzero) by at least MARGIN.
The loss is a hinge loss (linear). The loss does not depend on the
xnonzero magnitude (this may be a limitation). Hence, all nonzeroes
are equally important to exceed the maximum yzero.

(Alternatively, there is a commented-out binary xent loss.)

LIMITATIONS:
    - Only does pure stochastic gradient (batchsize = 1).
    - Loss does not depend on the xnonzero magnitude.
    - We will always use all nonzero entries, even if the training
      instance is very non-sparse.
"""


import numpy

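# Illustrative sketch only (not part of the original file): a minimal numpy
# version of the forward pass and the margin hinge loss described in the
# module docstring. The real implementation lives in model.py (not shown
# here), so every name, shape, and default value below is an assumption.
def _sketch_forward_and_margin_loss(x, w1, b1, w2, b2,
                                    nonzero_idxs, zero_sample_idxs,
                                    margin=0.25):
    """Return (y, loss) for a dense input vector x, given hypothetical
    parameters w1, b1, w2, b2 and the indices of x's nonzero columns and
    of the sampled zero columns."""
    def sigmoid(a):
        return 1.0 / (1.0 + numpy.exp(-a))
    # Forward pass, as in the docstring.
    h = sigmoid(numpy.dot(x, w1) + b1)
    y = sigmoid(numpy.dot(h, w2) + b2)
    ynonzero = y[nonzero_idxs]          # outputs at x's nonzero columns
    yzero = y[zero_sample_idxs]         # outputs at the sampled zero columns
    # Each ynonzero should exceed max(yzero) by at least `margin`, and each
    # yzero should be exceeded by min(ynonzero) by at least `margin`;
    # violations are penalized linearly (hinge loss).
    loss = numpy.maximum(0, margin - (ynonzero - yzero.max())).sum() \
         + numpy.maximum(0, margin - (ynonzero.min() - yzero)).sum()
    return y, loss
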
# Each training instance is a sparse vector, given as {column index: value}.
nonzero_instances = []
nonzero_instances.append({1: 0.1, 5: 0.5, 9: 1})
nonzero_instances.append({2: 0.3, 5: 0.5, 8: 0.8})
nonzero_instances.append({1: 0.2, 2: 0.3, 5: 0.5})

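# Illustrative sketch only (not in the original file): how one such sparse
# instance might be split into its nonzero columns and a random sample of
# zero columns, per the docstring. The input dimension and ZERO_SAMPLE_SIZE
# are presumably hyperparameters defined elsewhere; the defaults here are
# placeholders.
def _sketch_sample_zeros(instance, input_dimension=10, zero_sample_size=4):
    """Return (nonzero_idxs, zero_sample_idxs) for a {column: value} instance."""
    nonzero_idxs = sorted(instance.keys())
    zero_idxs = [j for j in range(input_dimension) if j not in instance]
    # Sample ZERO_SAMPLE_SIZE zero columns randomly, without replacement.
    zero_sample_idxs = numpy.random.permutation(zero_idxs)[:zero_sample_size]
    return nonzero_idxs, zero_sample_idxs
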
import model
model = model.Model()   # note: rebinds the name `model` from the module to a Model instance

for i in xrange(100000):
    # Select an instance
    instance = nonzero_instances[i % len(nonzero_instances)]

    # SGD update over instance
    model.update(instance)
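
# Hypothetical sketch (not in the original file): the changeset message says
# a minibatch update is now possible. Assuming, purely as an illustration,
# that Model.update() also accepts a list of instance dicts (model.py is not
# shown, so this API is an assumption), a minibatch driver could look like:
#
#   MINIBATCH_SIZE = 2
#   for i in xrange(100000):
#       minibatch = [nonzero_instances[(i * MINIBATCH_SIZE + j) % len(nonzero_instances)]
#                    for j in range(MINIBATCH_SIZE)]
#       model.update(minibatch)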