diff doc/v2_planning/optimization.txt @ 1015:0feaaa3fc566
Merged
author | Olivier Delalleau <delallea@iro>
date | Fri, 03 Sep 2010 14:32:21 -0400
parents | 5e9a3d9bc0b4
children | 618b9fdbfda5
--- a/doc/v2_planning/optimization.txt	Fri Sep 03 14:32:08 2010 -0400
+++ b/doc/v2_planning/optimization.txt	Fri Sep 03 14:32:21 2010 -0400
@@ -1,3 +1,37 @@
 Discussion of Optimization-Related Issues
 =========================================
 
+Members: JB, PL, OD
+
+Representative: JB
+
+
+Previous work - scikits, openopt, and scipy provide function optimization
+algorithms. These are not currently GPU-enabled, but may be in the future.
+
+
+IS PREVIOUS WORK SUFFICIENT?
+--------------------------------
+
+In many cases it is (I used it for sparse coding, and it was ok).
+
+These packages provide batch optimization, whereas we typically need online
+optimization.
+
+It can be faster (to run) and more convenient (to implement) to have
+optimization algorithms as Theano update expressions.
+
+
+What optimization algorithms do we want/need?
+---------------------------------------------
+
+ - sgd
+ - sgd + momentum
+ - sgd with annealing schedule
+ - TONGA
+ - James Martens' Hessian-free
+
+Do we need anything to make batch algos work better with Pylearn things?
+ - conjugate methods?
+ - L-BFGS?
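The "optimization algorithms as update expressions" idea above can be sketched in plain Python. This is an illustrative sketch, not Theano or Pylearn code: the toy quadratic loss, the `grad` function, and the `sgd_momentum_step` helper are all made-up names for the example. The point is that sgd + momentum reduces to a pair of update rules mapping old state to new state, which is exactly the shape of a Theano updates list.

```python
# Sketch: SGD with momentum written as update expressions.
# Each step maps (param, velocity) -> (new param, new velocity),
# in the spirit of Theano's `updates` argument to theano.function.
# The quadratic loss below is a toy example, not Pylearn API.

def grad(w):
    # Gradient of the toy loss L(w) = (w - 3)^2, minimized at w = 3.
    return 2.0 * (w - 3.0)

def sgd_momentum_step(w, v, lr=0.1, momentum=0.9):
    """One online update; returns the new (param, velocity) pair."""
    v_new = momentum * v - lr * grad(w)
    w_new = w + v_new
    return w_new, v_new

# Run the update rule online, one step at a time.
w, v = 0.0, 0.0
for _ in range(200):
    w, v = sgd_momentum_step(w, v)

print(w)  # converges toward the minimum at w = 3
```

In Theano the same pair of rules would be passed as symbolic `(shared_variable, new_expression)` updates, so the whole optimizer compiles into the training function rather than living in a separate batch-optimization driver.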