annotate doc/v2_planning/optimization.txt @ 1037:88b296cfba50

v2 learner api - added todo note
author James Bergstra <bergstrj@iro.umontreal.ca>
date Tue, 07 Sep 2010 12:25:10 -0400
parents 89e76e6e074f
children baf1988db557
Discussion of Optimization-Related Issues
=========================================

Members: JB, PL, OD, XG

Representative: JB


Previous work: scikits, openopt, and scipy provide function-optimization
algorithms. These are not currently GPU-enabled, but may be in the future.
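For reference, the batch interface these packages expose looks roughly like the sketch below (using `scipy.optimize.minimize`; openopt and the scikits packages expose broadly similar objective/gradient-callback interfaces). The toy objective and names here are illustrative, not anything from our codebase:

```python
import numpy as np
from scipy.optimize import minimize

def loss(w):
    # toy objective: squared distance to the point (3, 3)
    return ((w - 3.0) ** 2).sum()

def grad(w):
    # analytic gradient of the objective above
    return 2.0 * (w - 3.0)

# batch optimization: the optimizer drives the whole run via callbacks
res = minimize(loss, x0=np.zeros(2), jac=grad, method="L-BFGS-B")
# res.x is close to (3, 3) on success
```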


IS PREVIOUS WORK SUFFICIENT?
----------------------------

In many cases it is (I used it for sparse coding, and it was OK).

These packages provide batch optimization, whereas we typically need online
optimization.

It can be faster (to run) and more convenient (to implement) to express
optimization algorithms as Theano update expressions.
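A sketch of the contrast, with plain NumPy standing in for Theano: in Theano the line marked as the update expression would be written symbolically, e.g. `updates=[(w, w - lr * T.grad(cost, w))]`, and compiled once into a function; the data and loop below are a toy regression, not a proposed API:

```python
import numpy as np

# Online/minibatch SGD written as an explicit update rule.  In Theano the
# update would be a symbolic pair compiled into a function; here plain
# NumPy stands in for the compiled graph.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 5))
true_w = np.arange(1.0, 6.0)          # target weights for a toy regression
y = X @ true_w                        # noiseless targets

w = np.zeros(5)
lr = 0.05
for epoch in range(200):
    for i in range(0, len(X), 32):    # one minibatch at a time (online-style)
        xb, yb = X[i:i + 32], y[i:i + 32]
        grad = 2.0 * xb.T @ (xb @ w - yb) / len(xb)  # gradient of mean sq. error
        w = w - lr * grad             # the "update expression"
# w now closely approximates true_w
```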


What optimization algorithms do we want/need?
---------------------------------------------

- sgd
- sgd + momentum
- sgd with an annealing schedule
- TONGA
- James Martens's Hessian-free method
- conjugate gradients, batch and (large) mini-batch [that is also what Martens's method does]
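The first three items can be sketched as update rules (illustrative Python; the hyperparameter names `lr0`, `decay`, and `mu` are placeholders, not a proposed API):

```python
import numpy as np

# Illustrative update rules for sgd, sgd + momentum, and annealing.
def sgd_step(w, grad, lr):
    return w - lr * grad

def momentum_step(w, v, grad, lr, mu=0.9):
    v = mu * v - lr * grad            # velocity accumulates past gradients
    return w + v, v

def annealed_lr(lr0, t, decay=1e-3):
    return lr0 / (1.0 + decay * t)    # 1/t-style annealing schedule

# tiny demonstration on f(w) = w^2 (gradient 2w)
w, v = 5.0, 0.0
for t in range(500):
    g = 2.0 * w
    w, v = momentum_step(w, v, g, annealed_lr(0.1, t))
# w has decayed to approximately 0, the minimizer of f
```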

Do we need anything to make batch algorithms work better with Pylearn things?
- conjugate methods? yes
- L-BFGS? maybe, when needed
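As a reminder of what the conjugate-direction machinery buys, here is a minimal linear CG solver: a textbook sketch, not a proposal for the Pylearn API. Nonlinear CG for minibatch training uses the same search-direction recurrence with a line search in place of the exact step:

```python
import numpy as np

# Minimal linear conjugate gradient: solves A w = b for symmetric
# positive-definite A, i.e. minimizes 0.5 w'Aw - b'w.
def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    w = np.zeros_like(b)
    r = b - A @ w                         # residual = negative gradient
    p = r.copy()                          # first search direction
    for _ in range(max_iter or len(b)):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)        # exact line search along p
        w = w + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)  # next direction is A-conjugate
        p = r_new + beta * p
        r = r_new
    return w

# small SPD test system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
w = conjugate_gradient(A, b)
```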