# HG changeset patch
# User Olivier Delalleau
# Date 1283538741 14400
# Node ID 0feaaa3fc566cc7ef3b87da6bddc0293f7b9562e
# Parent  e169a5a18aa3fb6f91325861be932c7cf803a21e
# Parent  5e9a3d9bc0b41a54edb7cdf4f37a429db905ce04
Merged

diff -r e169a5a18aa3 -r 0feaaa3fc566 doc/v2_planning/optimization.txt
--- a/doc/v2_planning/optimization.txt	Fri Sep 03 14:32:08 2010 -0400
+++ b/doc/v2_planning/optimization.txt	Fri Sep 03 14:32:21 2010 -0400
@@ -1,3 +1,37 @@
 Discussion of Optimization-Related Issues
 =========================================
 
+Members: JB, PL, OD
+
+Representative: JB
+
+
+Previous work - scikits, openopt, and scipy provide function optimization
+algorithms. These are not currently GPU-enabled, but may be in the future.
+
+
+IS PREVIOUS WORK SUFFICIENT?
+--------------------------------
+
+In many cases it is (I used it for sparse coding, and it was OK).
+
+These packages provide batch optimization, whereas we typically need online
+optimization.
+
+It can be faster (to run) and more convenient (to implement) to have
+optimization algorithms written as Theano update expressions.
+
+
+What optimization algorithms do we want/need?
+---------------------------------------------
+
+ - sgd
+ - sgd + momentum
+ - sgd with annealing schedule
+ - TONGA
+ - James Martens' Hessian-free
+
+Do we need anything to make batch algos work better with Pylearn things?
+ - conjugate methods?
+ - L-BFGS?
+
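The "update expressions" idea mentioned in the patched text can be sketched without committing to any Pylearn API. Below is a minimal NumPy illustration of sgd + momentum in the Theano style, where the algorithm returns a list of (variable, new-value) update pairs instead of mutating parameters in place; all function and variable names here are hypothetical, not part of Pylearn or Theano.

```python
import numpy as np

def sgd_momentum_updates(params, grads, velocities, lr=0.01, momentum=0.9):
    """Return (variable, new_value) update pairs for SGD with momentum.

    Mimics the Theano convention of expressing an optimizer as a list of
    update expressions. All names are illustrative (hypothetical), chosen
    only to sketch the idea.
    """
    updates = []
    for p, g, v in zip(params, grads, velocities):
        v_new = momentum * v - lr * g      # velocity update expression
        updates.append((v, v_new))
        updates.append((p, p + v_new))     # parameter update expression
    return updates

# Online use: recompute the gradient and apply the pairs after each
# example or minibatch.
w = np.zeros(2)                            # parameters
v = np.zeros(2)                            # momentum buffer
grad = np.array([1.0, -2.0])               # gradient from one minibatch
updates = sgd_momentum_updates([w], [grad], [v], lr=0.1, momentum=0.9)
v, w = updates[0][1], updates[1][1]        # "apply" the update expressions
```

In Theano itself such pairs would be passed to the `updates` argument of a compiled function, which is what makes this formulation convenient for online (per-minibatch) optimization.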