pylearn: comparison of doc/v2_planning/optimization.txt @ 1013:5e9a3d9bc0b4
optimization - added some text

author:   James Bergstra <bergstrj@iro.umontreal.ca>
date:     Fri, 03 Sep 2010 14:07:34 -0400
parents:  dc5185cca21e
children: 618b9fdbfda5
comparing 1012:5d7022325d8e with 1013:5e9a3d9bc0b4

Discussion of Optimization-Related Issues
=========================================

Members: JB, PL, OD

Representative: JB


Previous work: scikits, OpenOpt, and SciPy provide function-optimization
algorithms. These are not currently GPU-enabled, but may be in the future.


IS PREVIOUS WORK SUFFICIENT?
----------------------------

In many cases it is (I used it for sparse coding, and it was ok).

These packages provide batch optimization, whereas we typically need online
optimization.

It can be faster (to run) and more convenient (to implement) to have
optimization algorithms expressed as Theano update expressions.


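To make the online setting concrete, here is a minimal pure-Python sketch (not Pylearn or Theano code; every name here is illustrative) of the kind of per-example update rule that such an update expression would encode, fitting one weight by streaming SGD:

```python
import random

# Illustrative online SGD: one update per example, in contrast to the
# batch optimizers in scipy/openopt that evaluate the full objective.
random.seed(0)
true_w = 3.0
# stream of (x, y) pairs with y ~= true_w * x plus a little noise
data = [(x, true_w * x + random.gauss(0, 0.01))
        for x in [random.uniform(-1, 1) for _ in range(500)]]

w = 0.0
lr = 0.1
for x, y in data:
    grad = 2 * (w * x - y) * x   # d/dw of the per-example squared error
    w -= lr * grad               # the "update expression": w <- w - lr * grad

assert abs(w - true_w) < 0.1
```

In Theano the last line of the loop would become a symbolic update pair attached to a compiled function, so the whole loop body runs in one call.
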
What optimization algorithms do we want/need?
---------------------------------------------

- sgd
- sgd + momentum
- sgd with annealing schedule
- TONGA
- James Martens' Hessian-free

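
The first three entries differ only in how the step is formed. A minimal pure-Python sketch (illustrative only, not Pylearn API) of plain sgd, sgd + momentum, and sgd with a 1/t annealing schedule on a one-parameter least-squares stream:

```python
import random

def make_stream(n=400, true_w=3.0, seed=1):
    # stream of (x, y) pairs with y ~= true_w * x plus a little noise
    rng = random.Random(seed)
    return [(x, true_w * x + rng.gauss(0, 0.01))
            for x in [rng.uniform(-1, 1) for _ in range(n)]]

def sgd(data, lr=0.1, mom=0.0, anneal=None):
    w, v = 0.0, 0.0
    for t, (x, y) in enumerate(data):
        # 1/t-style annealing schedule; anneal=None keeps the step fixed
        step = lr / (1.0 + anneal * t) if anneal else lr
        g = 2 * (w * x - y) * x      # per-example gradient
        v = mom * v - step * g       # momentum buffer (mom=0 recovers sgd)
        w += v
    return w

data = make_stream()
for kwargs in ({}, {"mom": 0.5}, {"anneal": 0.01}):
    assert abs(sgd(data, **kwargs) - 3.0) < 0.2
```

TONGA and Hessian-free are qualitatively different: both use curvature information rather than a single scaled gradient step.
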
Do we need anything to make batch algos work better with Pylearn things?

- conjugate methods?
- L-BFGS?
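
For reference, a minimal pure-Python sketch of linear conjugate gradient on a quadratic (the textbook algorithm, not Pylearn code) shows the batch interface such methods assume: a full gradient per iteration.

```python
# Minimizes f(x) = 1/2 x^T A x - b^T x for symmetric positive-definite A;
# the gradient of f is A x - b, and CG drives it to zero.
def cg(A, b, x, iters=10, tol=1e-10):
    r = [bi - sum(aij * xj for aij, xj in zip(row, x))
         for row, bi in zip(A, b)]          # residual = -gradient
    p = r[:]                                # first search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        Ap = [sum(aij * pj for aij, pj in zip(row, p)) for row in A]
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        # conjugate (not steepest-descent) direction update
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = cg(A, b, [0.0, 0.0])
# at the minimum, A x = b
assert abs(4 * x[0] + x[1] - 1.0) < 1e-4
assert abs(x[0] + 3 * x[1] - 2.0) < 1e-4
```

L-BFGS has the same batch interface (objective value plus full gradient), which is why wrapping SciPy's implementations around a Theano-computed gradient is a plausible integration path.
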