changeset 1262:4d7fdd04b66a

More RSTification
author Pascal Lamblin <lamblinp@iro.umontreal.ca>
date Tue, 28 Sep 2010 14:02:29 -0400
parents 93e1c7c9172b
children 10113a1050ce
files doc/v2_planning/API_optimization.txt
diffstat 1 files changed, 17 insertions(+), 7 deletions(-)
--- a/doc/v2_planning/API_optimization.txt	Mon Sep 27 10:59:00 2010 -0400
+++ b/doc/v2_planning/API_optimization.txt	Tue Sep 28 14:02:29 2010 -0400
@@ -133,7 +133,7 @@
         def __init__(self, x0, f, df, opt_algo, **kwargs):
             """Initialize state (arguments as in minimize())
             """
-        def __iter__(self): 
+        def __iter__(self):
             return self
         def next(self):
             """Take a step of minimization and return self raises StopIteration when
@@ -153,18 +153,26 @@
 
 Simple stochastic gradient descent could be called like this:
 
-    sgd([p], gradients=[g], step_size=.1) 
+.. code-block:: python
+
+    sgd([p], gradients=[g], step_size=.1)
 
 and this would return
 
+.. code-block:: python
+
     {p:p-.1*g}
 
 
 Simple stochastic gradient descent with extra updates:
 
-    sgd([p], gradients=[g], updates={a:b}, step_size=.1) 
+.. code-block:: python
+
+    sgd([p], gradients=[g], updates={a:b}, step_size=.1)
 
-will return 
+will return
+
+.. code-block:: python
 
     {a:b, p:p-.1*g}
 
@@ -172,6 +180,8 @@
 If the parameters collide with keys in a given updates dictionary an exception
 will be raised:
 
-    sgd([p], gradients=[g], updates={p:b}, step_size=.1) 
-    
-will raise a KeyError.
+.. code-block:: python
+
+    sgd([p], gradients=[g], updates={p:b}, step_size=.1)
+
+will raise a ``KeyError``.
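
The first hunk describes an iterator-style minimizer (``__iter__`` returning ``self``, ``next()`` taking one step and raising ``StopIteration`` when done) but the changeset shows no usage example for it. The sketch below illustrates how such an object might be driven; ``minimize_iterator``, ``x0``, ``f``, ``df`` and ``converged`` are assumed names inferred from the docstrings in this changeset, not a confirmed API.

.. code-block:: python

    # Hypothetical sketch, assuming the iterator protocol described above:
    # next() advances one minimization step and returns the iterator itself,
    # raising StopIteration when the algorithm decides it is finished.
    opt = minimize_iterator(x0, f, df, opt_algo='sgd', step_size=.1)
    for state in opt:          # each iteration is one minimization step
        if converged(state):   # user-supplied stopping criterion (assumed)
            break

Because ``next()`` returns ``self``, the loop variable is simply the minimizer object, and any intermediate state would be read from its attributes between steps.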