changeset 1370:5785cbac3361
Added a suggestion to solve the problem of Fixed vs. Varying parameters
author   | Olivier Delalleau <delallea@iro>
date     | Mon, 15 Nov 2010 16:17:59 -0500
parents  | f3a549bd8688
children | 98d4232df1d8
files    | doc/v2_planning/datalearn.txt
diffstat | 1 files changed, 40 insertions(+), 0 deletions(-)
--- a/doc/v2_planning/datalearn.txt	Mon Nov 15 15:20:49 2010 -0500
+++ b/doc/v2_planning/datalearn.txt	Mon Nov 15 16:17:59 2010 -0500
@@ -435,6 +435,46 @@
     or
     f(c1 = 0.2)
 
+OD comments: Here is a (hopefully simpler) suggestion to solve this problem.
+Consider any data{set,point} obtained by a transformation of an existing
+data{set,point} with parameters p1, p2, ..., pN. From the point of view of
+Theano variables, this is something like x2 = h(x1, p1=v1, ..., pN=vN), where
+x1, x2 are variables and h is an Op. In addition, v1, ..., vN are also
+variables, since they are parameters of the transformation that we may want
+to vary. This is not, however, the way the user would build the graph,
+because being forced to use variables for parameters is not user-friendly
+(IMO). Instead, someone would write:
+    d2 = t(d1, p1=w1, ..., pN=wN)
+where d1, d2 are data{set,point}s, t is the transformation, and w1, ..., wN
+are numeric values of the parameters. Then t would build the piece of graph
+above, so that when you ask for d2.numeric_value(), a function computing x2
+is compiled, taking v1, ..., vN as input variables.
+Now, the problem is that this may not be fully optimized, since parameters are
+assumed to be varying (so as not to be forced to recompile a different
+function when the user calls t with different parameter values). My suggestion
+is to make this the default behavior, but add an extra argument to t:
+    d2 = t(d1, p1=w1, ..., pN=wN, constants=['p3', 'p5'])
+The line above would do the same, except that the function being compiled
+would use the constant values w3 and w5 for p3 and p5.
+Razvan's example above would instead be written as follows:
+    def f(c1=0.2):
+        return transformK(..(transform2(transform1(input_data,
+                             corruption_layer_1=c1))))
+With this code you could create various transformed datasets by calling f
+with different values for c1. The first time you call f(c1=0).numeric_value(),
+a Theano function is compiled that takes a `corruption_layer_1` input variable
+(whose value is 0 when the function is called by `numeric_value`). If you call
+f().numeric_value(), the same function is re-used (no need to compile it) with
+this input set to 0.2. If, on the other hand, you want to compile a new
+function for each new value of your `corruption_layer_1` parameter, you would
+instead write:
+    def f(c1=0.2):
+        return transformK(..(transform2(transform1(input_data,
+                             corruption_layer_1=c1,
+                             constants=['corruption_layer_1']))))
+This would be one way to get automatic lazy function caching / compilation
+while still letting the user specify the parameters for which a new function
+needs to be compiled when their value changes.
 
 Discussion: Helper Functions
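
As a concrete illustration of the proposal above, here is a minimal,
hypothetical sketch of a transformation that keeps its parameter as a Theano
input variable by default and bakes it in as a constant when listed in
`constants`. The names `TransformedData`, `scale_features` and
`numeric_value` are invented for this example and are not part of any
existing pylearn or Theano API; only the `theano.function` and
`theano.tensor` calls are real.

    # Hypothetical sketch, not the datalearn API: a transformation whose
    # parameter is a symbolic input by default, or a graph constant when
    # it is named in `constants`.
    import numpy
    import theano
    import theano.tensor as T


    class TransformedData(object):
        """Holds a symbolic output plus the parameter inputs/values to feed it."""

        def __init__(self, expr, param_inputs, param_values):
            self.expr = expr                  # symbolic result (x2 in the text)
            self.param_inputs = param_inputs  # varying parameters (input variables)
            self.param_values = param_values  # numeric values fed at call time
            self._fn = None                   # compiled lazily on first use

        def numeric_value(self):
            # Compile only when a numeric value is actually requested.
            if self._fn is None:
                self._fn = theano.function(self.param_inputs, self.expr)
            return self._fn(*self.param_values)


    def scale_features(data, scale=0.2, constants=()):
        """Toy transformation t(d1, scale=w1): multiply the data by `scale`.

        By default `scale` stays a symbolic input, so a single compiled
        function can serve any value of the parameter. Listing 'scale' in
        `constants` freezes its current numeric value into the graph instead
        (a new function per value, but Theano can optimize with the constant).
        """
        x1 = T.constant(numpy.asarray(data, dtype='float64'))
        if 'scale' in constants:
            p = T.constant(float(scale))      # fixed: value baked into the graph
            inputs, values = [], []
        else:
            p = T.dscalar('scale')            # varying: stays a function input
            inputs, values = [p], [float(scale)]
        return TransformedData(x1 * p, inputs, values)


    # Usage, mirroring the example in the text:
    d2 = scale_features([1.0, 2.0, 3.0], scale=0.5)
    print(d2.numeric_value())                 # 'scale' is passed as an input

    d3 = scale_features([1.0, 2.0, 3.0], scale=0.5, constants=['scale'])
    print(d3.numeric_value())                 # 'scale' is a constant in the graph

In the varying (default) case, an actual implementation would additionally
cache compiled functions keyed on the structure of the graph, so that
f(c1=0) and f() in the example above share one compiled function as described
in the text; the sketch only compiles lazily, once per transformed object.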