changeset 1067:4f287324a5ad

Merged
author Olivier Delalleau <delallea@iro>
date Fri, 10 Sep 2010 10:00:49 -0400
parents e1aca94f28d8 2bbc464d6ed0
children 9fe0f0755b03
files doc/v2_planning/coding_style.txt
diffstat 3 files changed, 119 insertions(+), 4 deletions(-)
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/doc/v2_planning/api_optimization.txt	Fri Sep 10 10:00:49 2010 -0400
@@ -0,0 +1,98 @@
+Optimization API
+================
+
+Members: Bergstra, Lamblin, Delalleau, Glorot, Breuleux, Bordes
+Leader: Bergstra
+
+
+Description
+-----------
+
+This API is for iterative optimization algorithms, such as:
+
+ - stochastic gradient descent (incl. momentum, annealing)
+ - delta bar delta
+ - conjugate methods
+ - L-BFGS
+ - "Hessian Free"
+ - SGD-QN
+ - TONGA
+
+The API includes an iterative interface based on Theano, and a one-shot
+interface similar to SciPy's and MATLAB's that is based on Python and
+NumPy, using Theano only for the implementation.
+
+
+Iterative Interface
+-------------------
+
+def iterative_optimizer(parameters,
+        cost=None,
+        grads=None,
+        stop=None,
+        updates=None,
+        **kwargs):
+    """
+    :param parameters: list or tuple of Theano variables (typically shared vars)
+        that we want to optimize iteratively.  If we're minimizing f(x), then
+        together, these variables represent 'x'.
+
+    :param cost: scalar-valued Theano variable that computes an exact or
+        noisy estimate of the cost (what are the conditions on the noise?).
+        Some algorithms might need an exact cost, and some might ignore the
+        cost entirely if the grads are given.
+
+    :param grads: list or tuple of Theano variables representing the
+        gradients of the cost with respect to the corresponding parameters.
+        These default to tensor.grad(cost, parameters).
+
+    :param stop: a shared variable (scalar integer) that, if provided, will
+        be updated to indicate whether the iterative minimization algorithm
+        has finished (1) or requires more iterations (0).
+
+    :param updates: a dictionary to update with the (var, new_value) items
+        associated with the iterative algorithm.  The default is a new empty
+        dictionary.  A KeyError is raised in case of key collisions.
+
+    :param kwargs: algorithm-dependent arguments
+
+    :returns: a dictionary mapping each parameter to an expression giving
+       the value it should take in order to carry out the optimization
+       procedure.
+
+       If all the parameters are shared variables, then this dictionary may
+       be passed as the ``updates`` argument to theano.function.
+
+       There may be additional (key, value) pairs in the dictionary
+       corresponding to internal variables that are part of the optimization
+       algorithm.
+
+    """
+
+
+One-shot Interface
+------------------
+
+def minimize(x0, f, df, opt_algo, **kwargs):
+    """
+    Return a point x_new that minimizes function `f` with derivative `df`.
+
+    This is meant to provide an interface similar to SciPy's minimization
+    routines (e.g. those in scipy.optimize), or MATLAB's.
+
+    :type x0: numpy ndarray
+    :param x0: starting point for minimization
+
+    :type f: python callable mapping something like x0 to a scalar
+    :param f: function to minimize
+
+    :type df: python callable mapping something like x0 to the derivative
+        of f at that point
+    :param df: derivative of `f`
+
+    :param opt_algo: one of the functions that implements the
+        `iterative_optimizer` interface.
+
+    :param kwargs: passed through to `opt_algo`
+
+    """
+
+
+
--- a/doc/v2_planning/coding_style.txt	Fri Sep 10 09:53:50 2010 -0400
+++ b/doc/v2_planning/coding_style.txt	Fri Sep 10 10:00:49 2010 -0400
@@ -25,6 +25,7 @@
     * http://eikke.com/how-not-to-write-python-code/
     * http://jaynes.colorado.edu/PythonGuidelines.html
     * http://docs.djangoproject.com/en/dev/internals/contributing/#coding-style
+    * http://projects.scipy.org/numpy/wiki/CodingStyleGuidelines 
 
 We will probably want to take PEP-8 as starting point, and read what other
 people think about it / how other coding guidelines differ from it.
@@ -48,6 +49,10 @@
 
    * You should use two spaces after a sentence-ending period.
     --> Looks weird to me.
+    (DWF: This is an old convention from the typewriter era. It has more
+    or less been wiped out by HTML's convention of ignoring extra 
+    whitespace: see http://en.wikipedia.org/wiki/Sentence_spacing for
+    more detail. I think it's okay to drop this convention in source code.)
 
    * Imports should usually be on separate lines
     --> Can be a lot of lines wasted for no obvious benefit. I think this is
@@ -123,6 +128,11 @@
 Support 2.4 (because some of the clusters are still running 2.4) and write
 code that can be converted to 3.x with 2to3 in a straightforward way.
 Task: Write to-do's and to-not-do's to avoid compatibility issues. (OD)
+(DWF: Pauli Virtanen and others have put together extensive
+documentation in the process of porting NumPy to Py3K; see his notes at
+http://projects.scipy.org/numpy/browser/trunk/doc/Py3K.txt -- this is
+the most complete resource for complicated combinations of Python and C.)
+
 
    * C coding style
 How to write C code (in particular for Numpy / Cuda), and how to mix C and
@@ -151,6 +161,7 @@
    * VIM / Emacs plugins / config files
 To enforce good coding style automatically.
 Task: Look for existing options. (FB)
+(DWF: I have put some time into this for vim; I will send around my files.)
 
 Suggestion by PV
 ----------------
--- a/doc/v2_planning/optimization.txt	Fri Sep 10 09:53:50 2010 -0400
+++ b/doc/v2_planning/optimization.txt	Fri Sep 10 10:00:49 2010 -0400
@@ -1,9 +1,15 @@
-Discussion of Optimization-Related Issues
+=========================
+Optimization for Learning
+=========================
+
+Members: Bergstra, Lamblin, Delalleau, Glorot, Breuleux, Bordes
+Leader: Bergstra
+
+
+
+Initial Writeup by James
 =========================================
 
-Members: JB, PL, OD, XG
-
-Representative: JB
 
 
Previous work - scikits, openopt, scipy provide function optimization