.. _v2planning_optimization:


Optimization API
================

Members: Bergstra, Lamblin, Delalleau, Glorot, Breuleux, Bordes
Leader: Bergstra


Description
-----------

This API is for iterative optimization algorithms, such as:

- stochastic gradient descent (incl. momentum, annealing)
- delta bar delta
- conjugate methods
- L-BFGS
- "Hessian Free"
- SGD-QN
- TONGA

The API includes an iterative interface based on Theano, and a one-shot
interface similar to SciPy's and MATLAB's that is based on Python and Numpy
and uses Theano only for the implementation.


Theano Interface
----------------

The theano interface to an optimization algorithm is a function that returns a
dictionary of updates that can be used in theano.function.  Implementations of
iterative optimization algorithms should be global functions with a signature
like 'iterative_optimizer'.

.. code-block:: python

    def iterative_optimizer(parameters,
                            cost=None,
                            gradients=None,
                            stop=None,
                            updates=None,
                            **kwargs):
        """
        :param parameters: list or tuple of Theano variables
            that we want to optimize iteratively.  If we're minimizing f(x), then
            together, these variables represent 'x'.  Typically these are shared
            variables and their values are the initial values for the minimization
            algorithm.

        :param cost: scalar-valued Theano variable that computes an exact or noisy
            estimate of cost (what are the conditions on the noise?).  Some
            algorithms might need an exact cost, some algorithms might ignore the
            cost if the gradients are given.

        :param gradients: list or tuple of Theano variables representing the
            gradients on the corresponding parameters.  These default to
            tensor.grad(cost, parameters).

        :param stop: a shared variable (scalar integer) that (if provided) will be
            updated to say when the iterative minimization algorithm has finished
            (1) or requires more iterations (0).

        :param updates: a dictionary to update with the (var, new_value) items
            associated with the iterative algorithm.  The default is a new empty
            dictionary.  A KeyError is raised in case of key collisions.

        :param kwargs: algorithm-dependent arguments

        :returns: a dictionary mapping each parameter to an expression that it
            should take in order to carry out the optimization procedure.

            If all the parameters are shared variables, then this dictionary may
            be passed as the ``updates`` argument to theano.function.

            There may be more (key, value) pairs in the dictionary corresponding
            to internal variables that are part of the optimization algorithm.

        """

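For concreteness, a minimal sketch of an implementation that fits this
interface is shown below.  The ``sgd`` name and ``step_size`` keyword follow
the Examples section at the end of this document; the body is illustrative
only, not a committed implementation.

.. code-block:: python

    from theano import tensor

    def sgd(parameters, cost=None, gradients=None, stop=None, updates=None,
            step_size=0.1):
        """Plain stochastic gradient descent (sketch of `iterative_optimizer`)."""
        if gradients is None:
            gradients = tensor.grad(cost, parameters)
        rval = dict(updates) if updates else {}
        for param, grad in zip(parameters, gradients):
            if param in rval:
                # key collision with the caller-supplied updates dictionary
                raise KeyError('duplicate update for parameter', param)
            rval[param] = param - step_size * grad
        return rval

    # If all parameters are shared variables, the result can be passed straight
    # to theano.function, e.g. (hypothetical variables x, w, cost):
    #   train = theano.function([x], cost, updates=sgd([w], cost=cost))
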
Numpy Interface
---------------

The numpy interface to optimization algorithms is supposed to mimic scipy's.
Its arguments are numpy arrays, and functions that manipulate numpy arrays.

.. code-block:: python

    def minimize(x0, f, df, opt_algo, **kwargs):
        """
        Return a point x_new with the same type as x0 that minimizes function `f`
        with derivative `df`.

        This is supposed to provide an interface similar to scipy's minimize
        routines, or MATLAB's.

        :type x0: numpy ndarray or list of numpy ndarrays.
        :param x0: starting point for minimization

        :type f: python callable mapping something like x0 to a scalar
        :param f: function to minimize

        :type df: python callable mapping something like x0 to the derivative of
            f at that point
        :param df: derivative of `f`

        :param opt_algo: one of the functions that implements the
            `iterative_optimizer` interface.

        :param kwargs: passed through to `opt_algo`

        """

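A usage sketch (the quadratic ``f``/``df`` pair is made up for illustration,
and ``sgd``/``step_size`` are the hypothetical optimizer and keyword used in
the Examples section):

.. code-block:: python

    import numpy

    def f(x):
        # squared distance to the point (3, 3, ..., 3)
        return float(((x - 3.0) ** 2).sum())

    def df(x):
        return 2.0 * (x - 3.0)

    x_new = minimize(numpy.zeros(10), f, df, opt_algo=sgd, step_size=0.1)
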
There is also a numpy-based wrapper to the iterative algorithms.
This can be more useful than minimize() because it doesn't hog program
control.  Technically minimize() is probably implemented using this
minimize_iterator interface (a sketch of that follows the class below).

.. code-block:: python

    class minimize_iterator(object):
        """
        Attributes
         - x  - the current best estimate of the minimum
         - f  - the function being minimized
         - df - f's derivative function
         - opt_algo - the optimization algorithm at work (a serializable,
           callable object with the signature of iterative_optimizer above).

        """
        def __init__(self, x0, f, df, opt_algo, **kwargs):
            """Initialize state (arguments as in minimize())"""
        def __iter__(self):
            return self
        def next(self):
            """Take a step of minimization and return self.  Raises
            StopIteration when the algorithm is finished with minimization.
            """

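To illustrate the relationship claimed above, minimize() could be written in
terms of the iterator roughly as follows (a sketch only, not a committed
design):

.. code-block:: python

    def minimize(x0, f, df, opt_algo, **kwargs):
        # Drive the iterator to completion; a caller using minimize_iterator
        # directly would instead keep control and inspect opt.x between steps.
        opt = None
        for opt in minimize_iterator(x0, f, df, opt_algo, **kwargs):
            pass
        return opt.x
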
JB replies: I don't think so, but for the few lines of code required, I think
it would be nice to provide a function that matches scipy's API.

OD: Looks like the reply above was pasted in the wrong place... where was it
supposed to go?

Examples
--------

Simple stochastic gradient descent could be called like this::

    sgd([p], gradients=[g], step_size=.1)

and this would return::

    {p: p - .1*g}


Simple stochastic gradient descent with extra updates::

    sgd([p], gradients=[g], updates={a: b}, step_size=.1)

will return::

    {a: b, p: p - .1*g}


If the parameters collide with keys in a given updates dictionary, an exception
will be raised::

    sgd([p], gradients=[g], updates={p: b}, step_size=.1)

will raise a KeyError.
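
The ``stop`` argument is not exercised by the examples above.  One way a driver
loop might use it, assuming the optimizer updates the shared ``stop`` variable
as documented (the variable names and ``some_optimizer`` are hypothetical):

.. code-block:: python

    import numpy
    import theano
    from theano import tensor

    x = tensor.vector('x')
    w = theano.shared(numpy.zeros(5), name='w')
    cost = (tensor.dot(x, w) - 1.0) ** 2
    stop = theano.shared(numpy.asarray(0, dtype='int8'), name='stop')

    # `some_optimizer` stands for any implementation of iterative_optimizer
    # that supports the `stop` argument.
    step = theano.function([x], cost,
                           updates=some_optimizer([w], cost=cost, stop=stop))

    while not stop.get_value():
        step(numpy.random.rand(5))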