# HG changeset patch
# User Olivier Delalleau
# Date 1284128955 14400
# Node ID 2cf3ad953bf97094bbe1568c7428ce497513d9c4
# Parent  79eb0016f333406f6fa8b5da252932278fb3b790
optimization: Removed duplicated API draft and asked question

diff -r 79eb0016f333 -r 2cf3ad953bf9 doc/v2_planning/optimization.txt
--- a/doc/v2_planning/optimization.txt	Fri Sep 10 10:28:51 2010 -0400
+++ b/doc/v2_planning/optimization.txt	Fri Sep 10 10:29:15 2010 -0400
@@ -49,42 +49,7 @@
 Proposal for API
 ================
 
-Stick to the same style of API that we've used for SGD so far. I think it has
-worked well. It takes Theano expressions as inputs and returns Theano
-expressions as results. The caller is responsible for building those
-expressions into a callable function that does the minimization (and maybe
-other things too).
-
-
-def stochastic_gradientbased_optimization_updates(parameters, cost=None, grads=None, **kwargs):
-    """
-    :param parameters: list or tuple of Theano variables (typically shared
-        vars) that we want to optimize with an iterative algorithm.
-
-    :param cost: scalar-valued Theano variable that computes a noisy estimate
-        of the cost (what are the conditions on the noise?). The cost is
-        ignored if grads are given.
-
-    :param grads: list or tuple of Theano variables representing the gradients
-        on the corresponding parameters. These default to tensor.grad(cost,
-        parameters).
+See api_optimization.txt.
-
-    :param kwargs: algorithm-dependent arguments
-
-    :returns: a list of pairs (v, new_v) that indicate the value (new_v) each
-        variable (v) should take in order to carry out the optimization
-        procedure.
-
-    The first section of the return value list corresponds to the terms in
-    `parameters`, and the optimization algorithm can return additional update
-    expressions afterward. This list of pairs can be passed directly to the
-    dict() constructor to create a dictionary such that dct[v] == new_v.
-    """
-
-
-Why not a class interface with an __init__ that takes the kwargs and an
-updates() method that returns the updates? It would be wrong for auxiliary
-shared variables to be involved in two updates, so the interface should not
-encourage separate methods for those two steps.
-
-
-
+OD: Do we really need a different file? If yes, maybe create a subdirectory to
+    be able to easily find all files related to optimization?
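
For reference, a minimal sketch of the update-list style that the removed
draft describes, assuming plain SGD with a fixed step size. The name
sgd_updates and the learning_rate argument are illustrative choices, not part
of the draft API; the Theano calls (tensor.grad, theano.shared,
theano.function) are the real ones the draft refers to::

    import numpy
    import theano
    import theano.tensor as T

    def sgd_updates(parameters, cost=None, grads=None, learning_rate=0.01):
        """Return a list of (v, new_v) pairs implementing one SGD step."""
        if grads is None:
            grads = T.grad(cost, parameters)
        # One (v, new_v) pair per parameter; an algorithm with auxiliary
        # shared state (e.g. momentum) would append further pairs after these.
        return [(p, p - learning_rate * g) for p, g in zip(parameters, grads)]

As the draft says, the caller is responsible for building the returned pairs
into a callable function, e.g.::

    x = T.vector('x')
    w = theano.shared(numpy.zeros(5), name='w')
    cost = T.sqr(T.dot(x, w) - 1.0)
    train = theano.function([x], cost, updates=dict(sgd_updates([w], cost)))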