Optimization API
================

Members: Bergstra, Lamblin, Delalleau, Glorot, Breuleux, Bordes
Leader: Bergstra


Description
-----------

This API is for iterative optimization algorithms, such as:

- stochastic gradient descent (incl. momentum, annealing)
- delta bar delta
- conjugate methods
- L-BFGS
- "Hessian Free"
- SGD-QN
- TONGA

The API includes an iterative interface based on Theano, and a one-shot
interface, similar to SciPy's and MATLAB's, that is based on Python and NumPy
and only uses Theano for the implementation.


Iterative Interface
-------------------

def iterative_optimizer(parameters,
                        cost=None,
                        grads=None,
                        stop=None,
                        updates=None,
                        **kwargs):
35 """
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
36 :param parameters: list or tuple of Theano variables (typically shared vars)
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
37 that we want to optimize iteratively. If we're minimizing f(x), then
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
38 together, these variables represent 'x'.
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
39
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
40 :param cost: scalar-valued Theano variable that computes an exact or noisy estimate of
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
41 cost (what are the conditions on the noise?). Some algorithms might
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
42 need an exact cost, some algorithms might ignore the cost if the grads
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
43 are given.
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
44
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
45 :param grads: list or tuple of Theano variables representing the gradients on
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
46 the corresponding parameters. These default to tensor.grad(cost,
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
47 parameters).
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
48
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
49 :param stop: a shared variable (scalar integer) that (if provided) will be
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
50 updated to say when the iterative minimization algorithm has finished
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
51 (1) or requires more iterations (0).
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
52
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
53 :param updates: a dictionary to update with the (var, new_value) items
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
54 associated with the iterative algorithm. The default is a new empty
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
55 dictionary. A KeyError is raised in case of key collisions.
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
56
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
57 :param kwargs: algorithm-dependent arguments
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
58
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
59 :returns: a dictionary mapping each parameter to an expression that it
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
60 should take in order to carry out the optimization procedure.
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
61
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
62 If all the parameters are shared variables, then this dictionary may be
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
63 passed as the ``updates`` argument to theano.function.
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
64
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
65 There may be more key,value pairs in the dictionary corresponding to
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
66 internal variables that are part of the optimization algorithm.
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
67
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
68 """
a41cc29cee26 v2planning optimization - API draft
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
diff changeset
69
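For concreteness, here is a minimal sketch (not part of the draft) of how
plain SGD might implement this interface and how the returned dictionary
would be compiled with theano.function. The name sgd_optimizer and the
step_size keyword are hypothetical illustrations, not settled API.

import numpy
import theano
import theano.tensor as tensor

def sgd_optimizer(parameters, cost=None, grads=None, stop=None,
                  updates=None, step_size=0.01):
    # Plain SGD expressed through the iterative_optimizer interface.
    if grads is None:
        grads = tensor.grad(cost, parameters)
    if updates is None:
        updates = {}
    for p, g in zip(parameters, grads):
        if p in updates:
            raise KeyError("parameter already has an update expression", p)
        updates[p] = p - step_size * g
    return updates

# Since `x` is a shared variable, the returned dictionary can be passed
# directly as the ``updates`` argument of theano.function.
x = theano.shared(numpy.zeros(5), name='x')
cost = ((x - 1) ** 2).sum()
train = theano.function([], cost, updates=sgd_optimizer([x], cost=cost))
for i in range(100):
    train()

An algorithm with internal state (momentum, averaged gradients, etc.) would
create its own shared variables and add their update expressions to the same
dictionary, which is why the returned dictionary may contain more entries
than there are parameters.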

One-shot Interface
------------------

def minimize(x0, f, df, opt_algo, **kwargs):
    """
    Return a point x_new that minimizes function `f` with derivative `df`.

    This is supposed to provide an interface similar to SciPy's minimization
    routines, or MATLAB's.

    :type x0: numpy ndarray
    :param x0: starting point for minimization

    :type f: python callable mapping something like x0 to a scalar
    :param f: function to minimize

    :type df: python callable mapping something like x0 to the derivative of
        f at that point
    :param df: derivative of `f`

    :param opt_algo: one of the functions that implements the
        `iterative_optimizer` interface.

    :param kwargs: passed through to `opt_algo`

    """

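Below is a minimal sketch of how `minimize` could be implemented by wrapping
the iterative interface, which is the relationship suggested in the
Description above. It assumes that `f` and `df` can also be applied to a
symbolic Theano variable built from x0 (one possible reading of "something
like x0"); the `maxiter` guard is an addition for the sketch only, in case an
algorithm never sets `stop`.

import numpy
import theano

def minimize(x0, f, df, opt_algo, maxiter=1000, **kwargs):
    # Build the Theano graph once, then iterate until the algorithm raises
    # the `stop` flag (or until maxiter, since not every algorithm sets it).
    x = theano.shared(numpy.asarray(x0), name='x')
    stop = theano.shared(numpy.int8(0), name='stop')
    updates = opt_algo([x], cost=f(x),
                       grads=[df(x)] if df is not None else None,
                       stop=stop, **kwargs)
    step = theano.function([], [], updates=updates)
    for i in range(maxiter):
        if stop.get_value():
            break
        step()
    return x.get_value()

This is only one possible bridge; whether the two interfaces need to remain
distinct at all is questioned in the comments below.
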
OD: Could it be more convenient for x0 to be a list?

OD: Why make a distinction between iterative and one-shot versions? A one-shot
    algorithm can be seen as an iterative one that stops after its first
    iteration. The difference I see between the two interfaces proposed here
    is mostly that one relies on Theano while the other does not, but
    hopefully a non-Theano one can be created simply by wrapping around the
    Theano one.