doc/v2_planning/optimization.txt @ 1013:5e9a3d9bc0b4
author: James Bergstra <bergstrj@iro.umontreal.ca>
date:   Fri, 03 Sep 2010 14:07:34 -0400
Discussion of Optimization-Related Issues
=========================================

Members: JB, PL, OD
Representative: JB

Previous work: scikits, openopt, and scipy provide function optimization
algorithms.  These are not currently GPU-enabled, but may be in the future.


IS PREVIOUS WORK SUFFICIENT?
----------------------------

In many cases it is (I used it for sparse coding, and it was OK).

These packages provide batch optimization, whereas we typically need online
optimization.

It can be faster (to run) and more convenient (to implement) to have
optimization algorithms expressed as Theano update expressions.


What optimization algorithms do we want/need?
---------------------------------------------

 - sgd
 - sgd + momentum
 - sgd with annealing schedule
 - TONGA
 - James Martens' Hessian-free

Do we need anything to make batch algos work better with Pylearn things?

 - conjugate methods?
 - L-BFGS?
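To make the first three candidates concrete, here is a minimal sketch of the
update rules in plain Python on an invented 1-d quadratic toy objective.  This
is not pylearn code; all function names here are made up for illustration.  In
a Theano-based design, each rule would instead be written as a list of
(shared_variable, update_expression) pairs passed to
theano.function(..., updates=...), which is what "optimization algorithms as
Theano update expressions" refers to above.

```python
def grad(w):
    # Gradient of the toy objective f(w) = (w - 3)^2 (invented for this sketch).
    return 2.0 * (w - 3.0)

def sgd(w, lr=0.1, steps=100):
    # Plain sgd: step against the gradient with a fixed learning rate.
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def sgd_momentum(w, lr=0.1, mu=0.9, steps=300):
    # sgd + momentum: keep a velocity that accumulates past gradients.
    v = 0.0
    for _ in range(steps):
        v = mu * v - lr * grad(w)
        w = w + v
    return w

def sgd_annealed(w, lr0=0.1, tau=50.0, steps=100):
    # sgd with a 1/t annealing schedule: lr_t = lr0 * tau / (tau + t).
    for t in range(steps):
        w = w - (lr0 * tau / (tau + t)) * grad(w)
    return w
```

All three variants drive w toward the minimizer (w = 3) on this toy problem;
the point of the sketch is only that each algorithm is fully described by its
per-step update of the parameters (and, for momentum, one auxiliary state
variable), which is exactly the shape that fits a symbolic updates list.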