Discussion of Optimization-Related Issues
=========================================

Members: JB, PL, OD, XG

Representative: JB


Previous work: scikits, openopt, and scipy provide function optimization
algorithms. These are not currently GPU-enabled, but may be in the future.


IS PREVIOUS WORK SUFFICIENT?
----------------------------

In many cases it is (I used it for sparse coding, and it was ok).

These packages provide batch optimization, whereas we typically need online
optimization.
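
The scipy-style interface makes the mismatch concrete: the optimizer drives
the entire minimization itself from a deterministic function and gradient,
so there is no natural place to stream in a fresh minibatch at each step.
For example:

import numpy
from scipy.optimize import fmin_l_bfgs_b

def f(x):
    return ((x - 3.0) ** 2).sum()   # deterministic cost

def fprime(x):
    return 2.0 * (x - 3.0)          # its gradient

# fmin_l_bfgs_b runs the whole minimization loop internally
x_opt, f_opt, info = fmin_l_bfgs_b(f, numpy.zeros(5), fprime)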

It can be faster (to run) and more convenient (to implement) to have
optimization algorithms as Theano update expressions.
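
For example, here is a minimal sketch of plain SGD in the update-expression
style (the variable names are illustrative, not an existing pylearn API):

import numpy
import theano
import theano.tensor as T

w = theano.shared(numpy.zeros(5), name='w')   # parameter to optimize
x = T.vector('x')
cost = ((w - x) ** 2).sum()                   # toy quadratic cost

# the update is a symbolic expression, so it can be compiled (e.g. for GPU)
new_w = w - 0.1 * T.grad(cost, w)

# the caller compiles the (var, new_var) pair into a step function
step = theano.function([x], cost, updates=[(w, new_w)])
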
What optimization algorithms do we want/need?
---------------------------------------------

- sgd
- sgd + momentum (see the sketch after this list)
- sgd with annealing schedule
- TONGA
- James Martens' Hessian-free
- Conjugate gradients, batch and (large) mini-batch [that is also what
  Martens' method does]

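As a sketch of how sgd + momentum fits the update-expression style, reusing
w, x, and cost from the SGD sketch above (hypothetical names, not a settled
interface):

# velocity is an auxiliary shared variable owned by the algorithm
velocity = theano.shared(numpy.zeros(5), name='velocity')
momentum, step_size = 0.9, 0.1

# both pairs are computed from the old values, then applied together
new_velocity = momentum * velocity - step_size * T.grad(cost, w)
updates = [(w, w + new_velocity), (velocity, new_velocity)]
step = theano.function([x], cost, updates=updates)
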
Do we need anything to make batch algos work better with Pylearn things?

- conjugate methods? yes
- L-BFGS? maybe, when needed


Proposal for API
================

Stick to the same style of API that we've used for SGD so far. I think it
has worked well. It takes Theano expressions as inputs and returns Theano
expressions as results. The caller is responsible for building those
expressions into a callable function that does the minimization (and
possibly other things too).


def stochastic_gradientbased_optimization_updates(parameters, cost=None,
                                                  grads=None, **kwargs):
    """
    :param parameters: list or tuple of Theano variables (typically shared
        vars) that we want to optimize iteratively.

    :param cost: scalar-valued Theano variable that computes a noisy
        estimate of the cost (what are the conditions on the noise?). The
        cost is ignored if grads are given.

    :param grads: list or tuple of Theano variables representing the
        gradients on the corresponding parameters. These default to
        tensor.grad(cost, parameters).

    :param kwargs: algorithm-dependent arguments

    :returns: a list of pairs (v, new_v) that indicate the value (new_v)
        each variable (v) should take in order to carry out the optimization
        procedure.

        The first section of the return value list corresponds to the terms
        in `parameters`, and the optimization algorithm can append
        additional update expressions afterward. This list of pairs can be
        passed directly to the dict() constructor to create a dictionary
        such that dct[v] == new_v.
    """
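
A hypothetical use of this signature, reusing w, x, and cost from the
sketches above (the step_size kwarg and the call below are assumptions
about one possible implementation, not settled API):

updates = stochastic_gradientbased_optimization_updates(
    [w], cost=cost, step_size=0.1)
dct = dict(updates)                 # dct[v] == new_v, as documented
step = theano.function([x], cost, updates=updates)
for _ in range(100):
    step(numpy.random.rand(5))      # one noisy minimization step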


Why not a class interface with an __init__ that takes the kwargs, and an
updates() method that returns the updates? It would be wrong for auxiliary
shared variables (e.g. a momentum velocity) to be involved in two different
update lists, so the interface should not encourage separate methods for
those two steps.
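
For concreteness, the discouraged shape would look something like this
(hypothetical sketch, assuming the imports from the sketches above):

class SGDMomentum(object):
    def __init__(self, momentum=0.9, step_size=0.1):
        self.momentum = momentum
        self.step_size = step_size
        self.velocities = {}

    def updates(self, parameters, cost):
        pairs = []
        for p in parameters:
            if p not in self.velocities:
                self.velocities[p] = theano.shared(p.get_value() * 0.0)
            v = self.velocities[p]
            new_v = self.momentum * v - self.step_size * T.grad(cost, p)
            pairs.extend([(p, p + new_v), (v, new_v)])
        return pairs

Calling updates() twice on the same instance would entangle the same
velocity variables in two different update lists, which is exactly the
hazard the single-function interface avoids.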