# HG changeset patch
# User James Bergstra
# Date 1283537254 14400
# Node ID 5e9a3d9bc0b41a54edb7cdf4f37a429db905ce04
# Parent 5d7022325d8e7389067274c03e5ebbd5ecbb5aa0
optimization - added some text

diff -r 5d7022325d8e -r 5e9a3d9bc0b4 doc/v2_planning/optimization.txt
--- a/doc/v2_planning/optimization.txt	Fri Sep 03 13:55:01 2010 -0400
+++ b/doc/v2_planning/optimization.txt	Fri Sep 03 14:07:34 2010 -0400
@@ -1,3 +1,37 @@
 Discussion of Optimization-Related Issues
 =========================================
 
+Members: JB, PL, OD
+
+Representative: JB
+
+
+Previous work: scikits, openopt, and scipy provide function optimization
+algorithms. These are not currently GPU-enabled, but may be in the future.
+
+
+IS PREVIOUS WORK SUFFICIENT?
+--------------------------------
+
+In many cases it is (I used it for sparse coding, and it was OK).
+
+These packages provide batch optimization, whereas we typically need online
+optimization.
+
+It can be faster (to run) and more convenient (to implement) to have
+optimization algorithms as Theano update expressions (see the sketches below).
+
+
+What optimization algorithms do we want/need?
+---------------------------------------------
+
+ - sgd
+ - sgd + momentum
+ - sgd with annealing schedule
+ - TONGA
+ - James Martens' Hessian-free
+
+Do we need anything to make batch algos work better with Pylearn components?
+ - conjugate methods?
+ - L-BFGS? (see the scipy sketch below)
+
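
Regarding the note above that optimizers can be written as Theano update
expressions: the sketch below shows plain sgd expressed that way. This is not
Pylearn code; the logistic-regression cost and names such as ``lr`` and
``params`` are illustrative assumptions. ::

    import numpy as np
    import theano
    import theano.tensor as T

    # Toy model (logistic regression), only here to give the sketch a cost.
    x = T.matrix('x')
    y = T.ivector('y')
    W = theano.shared(np.zeros((784, 10), dtype=theano.config.floatX), name='W')
    b = theano.shared(np.zeros(10, dtype=theano.config.floatX), name='b')

    p_y = T.nnet.softmax(T.dot(x, W) + b)
    cost = -T.mean(T.log(p_y)[T.arange(y.shape[0]), y])

    lr = 0.01                      # fixed learning rate (assumed)
    params = [W, b]
    grads = T.grad(cost, params)

    # Each update is a symbolic (variable, new_value) pair; the compiled
    # function applies them all in one call, which is what makes online sgd
    # cheap to run and short to write.
    updates = [(p, p - lr * g) for p, g in zip(params, grads)]
    sgd_step = theano.function([x, y], cost, updates=updates)

Each call ``sgd_step(batch_x, batch_y)`` then performs one online update in
place on the shared parameters.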
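
The sgd + momentum and annealing-schedule variants from the list fit the same
pattern: they only add extra shared state (a velocity per parameter, a step
counter) and slightly different update expressions. A self-contained sketch
with a toy quadratic cost; the schedule lr_t = lr0 * tau / max(t, tau) and the
hyper-parameter values are assumptions, not a Pylearn API. ::

    import numpy as np
    import theano
    import theano.tensor as T

    # Toy cost so the sketch stands alone; names are illustrative.
    w = theano.shared(np.asarray([3.0, -2.0], dtype=theano.config.floatX), name='w')
    cost = T.sum(w ** 2)
    grad = T.grad(cost, w)

    # Extra shared state: velocity for momentum, step counter for annealing.
    vel = theano.shared(np.zeros(2, dtype=theano.config.floatX), name='vel')
    t = theano.shared(np.asarray(0.0, dtype=theano.config.floatX), name='t')

    lr0, tau, mu = 0.1, 100.0, 0.9          # assumed hyper-parameters
    lr_t = lr0 * tau / T.maximum(t, tau)    # constant at first, then 1/t decay
    new_vel = mu * vel - lr_t * grad

    # All right-hand sides are evaluated with the old values, so the three
    # updates below are applied consistently in a single call.
    updates = [(vel, new_vel), (w, w + new_vel), (t, t + 1)]
    step = theano.function([], cost, updates=updates)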
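
On the question of batch algorithms: one low-effort path is to keep using
scipy, compiling the objective and gradient with Theano and handing them to
``scipy.optimize``. A minimal sketch; the toy least-squares problem and all
names are assumptions standing in for a real Pylearn model (whose parameters
would need to be flattened into one vector). ::

    import numpy as np
    import theano
    import theano.tensor as T
    from scipy.optimize import fmin_l_bfgs_b

    rng = np.random.RandomState(0)
    X = rng.randn(100, 5).astype(theano.config.floatX)
    y = rng.randn(100).astype(theano.config.floatX)

    w = T.vector('w')                      # flat parameter vector, as scipy expects
    loss = T.sum((T.dot(X, w) - y) ** 2)   # full-dataset (batch) objective
    grad = T.grad(loss, w)
    loss_and_grad = theano.function([w], [loss, grad])

    def f_and_g(flat_w):
        # scipy passes float64; cast for Theano, and return float64 for scipy.
        l, g = loss_and_grad(flat_w.astype(theano.config.floatX))
        return float(l), np.asarray(g, dtype='float64')

    w_opt, final_loss, info = fmin_l_bfgs_b(f_and_g, np.zeros(5))

``scipy.optimize.fmin_cg`` takes the objective and the gradient as separate
callables, but the same compiled Theano function can back both.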