changeset 1071:2cf3ad953bf9

optimization: Removed duplicated API draft and asked a question
author Olivier Delalleau <delallea@iro>
date Fri, 10 Sep 2010 10:29:15 -0400
parents 79eb0016f333
children 04bbf05d249c 3e7978201ffc
files doc/v2_planning/optimization.txt
diffstat 1 files changed, 3 insertions(+), 38 deletions(-)
--- a/doc/v2_planning/optimization.txt	Fri Sep 10 10:28:51 2010 -0400
+++ b/doc/v2_planning/optimization.txt	Fri Sep 10 10:29:15 2010 -0400
@@ -49,42 +49,7 @@
 Proposal for API
 ================
 
-Stick to the same style of API that we've used for SGD so far.  I think it has
-worked well.  It takes theano expressions as inputs and returns theano
-expressions as results.  The caller is responsible for building those
-expressions into a callable function that does the minimization (and other
-things too maybe).
-
-
-def stochastic_gradientbased_optimization_updates(parameters, cost=None, grads=None, **kwargs):
-   """
-   :param parameters: list or tuple of Theano variables (typically shared vars)
-       that we want to optimize with an iterative algorithm.
-
-   :param cost: scalar-valued Theano variable that computes a noisy estimate of
-       the cost (what are the conditions on the noise?).  The cost is ignored
-       if grads are given.
-
-   :param grads: list or tuple of Theano variables representing the gradients
-       with respect to the corresponding parameters.  These default to
-       tensor.grad(cost, parameters).
+See api_optimization.txt.
 
-   :param kwargs: algorithm-dependent arguments
-
-   :returns: a list of pairs (v, new_v) that indicate the value (new_v) each
-      variable (v) should take in order to carry out the optimization procedure.
-
-      The first section of the return value list corresponds to the terms in
-      `parameters`, and the optimization algorithm can return additional update
-      expressions afterward.  This list of pairs can be passed directly to the
-      dict() constructor to create a dictionary such that dct[v] == new_v.
-   """
-
-
-Why not a class interface with an __init__ that takes the kwargs, and an
-updates() that returns the updates?  It would be wrong for auxiliary shared
-variables to be involved in two updates, so the interface should not encourage
-separate methods for those two steps.
-
-
-
+OD: Do we really need a different file? If yes, maybe create a subdirectory to
+    be able to easily find all files related to optimization?
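
For reference, a minimal sketch (not part of the changeset) of the
updates-returning API style described by the removed draft, assuming Theano.
The name `sgd_updates` and the `lr` argument are illustrative only, not
functions or parameters defined in this repository.

import numpy
import theano
import theano.tensor as T

def sgd_updates(parameters, cost=None, grads=None, lr=0.01):
    """Return a list of (v, new_v) pairs implementing plain SGD updates."""
    if grads is None:
        grads = T.grad(cost, parameters)
    # The first entries correspond to `parameters`; an algorithm with auxiliary
    # shared variables could append extra update pairs after them.
    return [(p, p - lr * g) for p, g in zip(parameters, grads)]

# The caller builds the pairs into a callable function that does the minimization.
x = T.vector('x')
w = theano.shared(numpy.zeros(3), name='w')
cost = (T.dot(x, w) - 1.0) ** 2
train = theano.function([x], cost, updates=sgd_updates([w], cost=cost, lr=0.1))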