DataLearn: How to plug Datasets & Learner together?
===================================================

Participants
------------
- Yoshua
- Razvan
- Olivier D [leader]

High-Level Objectives
---------------------

* Simple ML experiments should be simple to write
* More complex / advanced scenarios should be possible without being forced
  to work "outside" of this framework
* Computations should be optimized whenever possible
* Existing code (in any language) should be "wrappable" within this
  framework
* It should be possible to replace [parts of] this framework with C++ code

Theano-Like Data Flow
---------------------

We want to rely on Theano to be able to take advantage of its efficient
computations. The general idea is that if we chain multiple processing
elements (think e.g. of a feature selection step followed by a PCA projection,
then a rescaling within a fixed bounded interval), the overall transformation
from input to output data can be represented by a Theano symbolic graph. When
one wants to access the actual numeric data, a function is compiled so as to
do these computations efficiently.
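
To make this concrete, here is a small self-contained sketch of such a chain
expressed as a single Theano graph (the selected columns and the projection
matrix are arbitrary stand-ins, not part of any proposed API):

.. code-block:: python

    import numpy
    import theano
    import theano.tensor as T

    x = T.dmatrix('x')                     # symbolic input data (rows = samples)
    selected = x[:, [0, 2, 3]]             # "feature selection": keep three columns
    projection = numpy.random.randn(3, 2)  # stand-in for a learned PCA projection
    projected = T.dot(selected, projection)
    # Rescale all values into a fixed [0, 1] interval.
    rescaled = ((projected - projected.min()) /
                (projected.max() - projected.min()))
    # A single compiled function runs the whole chain on numeric data.
    pipeline = theano.function([x], rescaled)
    output = pipeline(numpy.random.rand(5, 4))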

We discussed some specific API options for datasets and learners, which will
be added to this file in the future, but a core question that we feel should
be addressed first is how this Theano-based implementation could be achieved
exactly. For this purpose, in the following, let us assume that a dataset is
simply a matrix whose rows represent individual samples, and columns
individual features. How to handle field names, non-tensor-like data, etc. is
a very important topic that is not yet discussed in this file.

A question we did not discuss much is to what extent the architecture could
be "theanified", i.e. whether a whole experiment could be defined as a Theano
graph on which high-level optimizations could be performed, while also
relying on Theano to "run" the graph. The other option is to use a different
mechanism, with underlying Theano graphs being built wherever possible to link
the various components of an experiment together.

For now, let us consider the latter option, where each dataset contains a
pointer to a Theano variable that represents the data stored in this dataset.
One issue with this approach is illustrated by the following example. Imagine
we want to iterate over samples in a dataset and do something with their
numeric value. We would want the code to be as close as possible to:

.. code-block:: python

    for sample in dataset:
        do_something_with(sample.numeric_value())

A naive implementation of the sample API could be (assuming each sample also
contains a ``variable`` member which is the variable representing this
sample's data):

.. code-block:: python

    def numeric_value(self):
        if self.function is None:
            # Compile function to output the numeric value stored in this
            # sample's variable.
            self.function = theano.function([], self.variable)
        return self.function()

However, this is not a good idea, because it would trigger a new function
compilation for each sample. Instead, we would want something like this:

.. code-block:: python

    def numeric_value(self):
        if self.function_storage[0] is None:
            # Compile function to output the numeric value stored in this
            # sample's variable. This function takes as input the index of
            # the sample in the dataset, and is shared among all samples.
            self.function_storage[0] = theano.function(
                [self.symbolic_index], self.variable)
        return self.function_storage[0](self.numeric_index)

In the code above, we assume that all samples created by the action of
iterating over the dataset share the same ``function_storage``,
``symbolic_index`` and ``variable``: the first time we try to access the
numeric value of some sample, a function is compiled that takes as input the
index and outputs the variable. The only difference between samples is thus
that they are given a different numeric value for the index
(``numeric_index``).
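
For concreteness, here is a minimal sketch of a dataset whose iterator builds
such samples (the class and attribute names, e.g. ``MatrixDataset``, are
hypothetical; the point is only that every sample shares the same
``function_storage``, ``symbolic_index`` and ``variable``):

.. code-block:: python

    import theano

    class Sample(object):
        def __init__(self, variable, symbolic_index, numeric_index,
                     function_storage):
            self.variable = variable                  # symbolic expression for one sample
            self.symbolic_index = symbolic_index      # shared symbolic index
            self.numeric_index = numeric_index        # this sample's position
            self.function_storage = function_storage  # shared one-element list

        def numeric_value(self):
            if self.function_storage[0] is None:
                self.function_storage[0] = theano.function(
                    [self.symbolic_index], self.variable)
            return self.function_storage[0](self.numeric_index)

    class MatrixDataset(object):
        def __init__(self, data):
            self.data = theano.shared(data)           # numeric data (numpy matrix)
            self.symbolic_index = theano.tensor.iscalar('index')
            self.variable = self.data[self.symbolic_index]
            self.function_storage = [None]            # shared across all samples

        def __len__(self):
            return self.data.get_value(borrow=True).shape[0]

        def __iter__(self):
            for i in xrange(len(self)):
                yield Sample(self.variable, self.symbolic_index,
                             i, self.function_storage)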

Another way to obtain the same result is to actually let the user take care
of compiling the function. It would allow the user to really control what is
being compiled, at the cost of having to write more code:

.. code-block:: python

    symbolic_index = dataset.get_index()  # Or just theano.tensor.iscalar()
    get_sample = theano.function([symbolic_index],
                                 dataset[symbolic_index].variable)
    for numeric_index in xrange(len(dataset)):
        do_something_with(get_sample(numeric_index))

James comments: this is how I have written the last couple of projects; it's
slightly verbose, but it's clear and efficient.

The code above may also be simplified by providing helper functions. In the
example above, such a function could allow us to iterate on the numeric
values of samples in a dataset while taking care of compiling the appropriate
Theano function. See Discussion: Helper Functions below.
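
Such a helper could look roughly like this (a sketch only: ``numeric_samples``
is a made-up name, and it assumes the same dataset indexing as in the snippet
above):

.. code-block:: python

    import theano

    def numeric_samples(dataset):
        """Iterate over the numeric values of the samples in ``dataset``,
        compiling the required Theano function only once."""
        symbolic_index = theano.tensor.iscalar('index')
        get_sample = theano.function([symbolic_index],
                                     dataset[symbolic_index].variable)
        for numeric_index in xrange(len(dataset)):
            yield get_sample(numeric_index)

    # With such a helper, the user-side loop becomes:
    # for value in numeric_samples(dataset):
    #     do_something_with(value)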

Note that although the above example focused on how to iterate over a dataset,
it can be cast into a more generic problem, where some data (either dataset or
sample) is the result of some transformation applied to other data, which is
parameterized by parameters p1, p2, ..., pN (in the above example, we were
considering a sample that was obtained by taking the p1-th element in a
dataset). If we use different values for a subset Q of the parameters but keep
the other parameters fixed, we would probably want to compile a single
function that takes as input all parameters in Q, while the other parameters
are fixed. It may be nice to try to get the best of both worlds, letting the
user take control of what is being compiled, while leaving the option of a
sensible default behavior for those who do not want to worry about it.
Whether this is possible / desirable is still to be determined.
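
As an illustration of compiling over only a subset Q of the parameters, here
is a toy Theano sketch (all variables are hypothetical): the index ``p1``
remains a free input of the compiled function, while a scaling parameter
``p2`` is fixed at compilation time through ``givens``:

.. code-block:: python

    import numpy
    import theano
    import theano.tensor as T

    data = theano.shared(numpy.random.rand(100, 5))  # toy dataset
    p1 = T.iscalar('p1')   # parameter we keep free: the sample index
    p2 = T.dscalar('p2')   # parameter we fix: a scaling factor
    transformed = data[p1] * p2

    # Q = {p1}: p1 stays an input, p2 is frozen to a constant via `givens`.
    get_scaled_sample = theano.function([p1], transformed,
                                        givens={p2: T.constant(2.0)})
    value = get_scaled_sample(3)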

What About Learners?
--------------------

The discussion above only mentioned datasets, but not learners. The learning
part of a learner is not a main concern (currently). What matters most w.r.t.
what was discussed above is how a learner takes as input a dataset and
outputs another dataset that can be used with the dataset API.

A Learner may be able to compute various things. For instance, a Neural
Network may output a ``prediction`` vector (whose elements correspond to
estimated probabilities of each class in a classification task), as well as a
``cost`` vector (whose elements correspond to the penalized NLL, the NLL
alone and the classification error). We would want to be able to build a
dataset that contains some of these quantities computed on each sample in the
input dataset.

The Neural Network code would then look something like this:

.. code-block:: python

    class NeuralNetwork(Learner):

        # The decorator below is responsible for turning a function that
        # takes a symbolic sample as input and outputs a Theano variable
        # into a function that can also be applied on numeric sample data,
        # or on symbolic datasets.
        # Other approaches than a decorator are possible (e.g. using
        # different function names).
        @datalearn(..)
        def compute_prediction(self, sample):
            return softmax(theano.tensor.dot(self.weights, sample.input))

        @datalearn(..)
        def compute_nll(self, sample):
            return - log(self.compute_prediction(sample)[sample.target])

        @datalearn(..)
        def compute_penalized_nll(self, sample):
            return (self.compute_nll(sample) +
                    theano.tensor.sum(self.weights**2))

        @datalearn(..)
        def compute_class_error(self, sample):
            probabilities = self.compute_prediction(sample)
            predicted_class = theano.tensor.argmax(probabilities)
            return predicted_class != sample.target

        @datalearn(..)
        def compute_cost(self, sample):
            return theano.tensor.concatenate([
                self.compute_penalized_nll(sample),
                self.compute_nll(sample),
                self.compute_class_error(sample),
            ])

The ``@datalearn`` decorator would allow such a Learner to be used e.g. like
this:

.. code-block:: python

    nnet = NeuralNetwork()
    # Symbolic dataset that represents the output on symbolic input data.
    predict_dataset = nnet.compute_prediction(dataset)
    for sample in dataset:
        # Symbolic sample that represents the output on a single symbolic
        # input sample.
        predict_sample = nnet.compute_prediction(sample)
    # Numeric prediction.
    predict_numeric = nnet.compute_prediction({'input': numpy.zeros(10)})
    # Combining multiple symbolic outputs.
    multiple_fields_dataset = ConcatDataSet([
        nnet.compute_prediction(dataset),
        nnet.compute_cost(dataset),
    ])

In the code above, if one wants to obtain the numeric value of an element of
``multiple_fields_dataset``, the Theano function being compiled should be
able to optimize computations so that the simultaneous computation of
``prediction`` and ``cost`` is done efficiently.
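
This is essentially what a single Theano function with several outputs
already provides: shared sub-expressions are computed only once. A standalone
toy sketch (independent of the ``@datalearn`` machinery) of the kind of
sharing we would expect:

.. code-block:: python

    import numpy
    import theano
    import theano.tensor as T

    x = T.dvector('x')
    target = T.iscalar('target')
    weights = theano.shared(numpy.random.rand(3, 10))

    scores = T.dot(weights, x)
    prediction = T.exp(scores) / T.sum(T.exp(scores))  # softmax over 3 classes
    nll = -T.log(prediction[target])
    cost = nll + T.sum(weights ** 2)                   # penalized NLL

    # One compiled function with both outputs: the sub-graph computing
    # `prediction` is shared, so the forward pass is executed only once.
    f = theano.function([x, target], [prediction, cost])
    pred_value, cost_value = f(numpy.random.rand(10), 1)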

Discussion: Are Datasets Variables / Ops?
-----------------------------------------

OD wonders: Should datasets directly be Theano Variables, or should they be a
different object subclass containing a Theano Variable? The advantage of the
former option would be that they would fit directly within the Theano
framework, which may allow high-level optimizations on data transformations.
However, we would lose the ability to combine Theano expressions coded in
individual datasets into a single graph. Currently, I instead considered that
a dataset has a member that is a Theano variable, and this variable
represents the data stored in the dataset. The same is done for individual
data samples.

James asks: Why would a Theano graph in which some nodes represent datasets
give up the ability to combine Theano expressions coded in individual
datasets? Firstly, if you want to use Theano expressions and compiled
functions to implement the perform() method of an Op, you can do that.
Secondly, you can just include those 'expressions coded in individual
datasets' into the overall graph.

OD replies to James: What I had in mind is that you would be forced to
compile your own function inside the perform() method of an Op. This seemed
like a potential problem to me because it would prevent Theano from seeing
the whole fine-grained graph and doing optimizations across multiple dataset
transformations (there may also be additional overhead from calling multiple
functions). But if you are saying it is possible to include 'expressions
coded in individual datasets' into the overall graph, then I guess this point
is moot. Would this be achieved with an optimization that replaces the
dataset node with its internal graph?

Razvan comments: 1) Having Theano expressions inside the perform of a Theano
Op can lead to issues. I know I had to deal with a few when implementing
Scan, which does exactly this. Well, to be fair, these issues mostly come
into play when the inner graph has to interact with the outer graph, and most
of the time they can be solved. I guess all that I'm saying is that going
that way might lead to some headaches for developers, though I guess some
headache will be involved no matter what.
2) In my view (I'm not sure this is what Olivier was saying) the idea of
not putting the Dataset into a Variable is to not put the logic related to
loading data, dividing it into slices when running it on the GPU and so on
into a Theano variable. In my view this logic goes into a DataSet class
that gives you shared variables, symbolic indices into those shared
variables, and also numeric indices. When looping through those numeric
indices, the dataset class can reload parts of the data into the shared
variable and so on.
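
A rough sketch of the kind of DataSet class described above (hypothetical
names, arbitrary chunking policy): it owns a shared variable holding the
currently loaded slice of the data, and reloads that slice whenever a
requested index falls outside of it.

.. code-block:: python

    import theano
    import theano.tensor as T

    class ChunkedDataSet(object):
        """Keep one chunk of the data in a Theano shared variable, and
        reload the chunk when a requested index falls outside of it."""

        def __init__(self, data, chunk_size=1000):
            self.data = data                   # full data, e.g. a numpy matrix
            self.chunk_size = chunk_size
            self.chunk_start = 0
            self.shared_chunk = theano.shared(data[0:chunk_size])
            self.symbolic_index = T.iscalar('index')
            # Symbolic expression for one sample, relative to the loaded chunk.
            self.variable = self.shared_chunk[self.symbolic_index]

        def load_chunk_for(self, numeric_index):
            """Ensure the chunk containing `numeric_index` is loaded, and
            return the corresponding index within the chunk."""
            start = (numeric_index // self.chunk_size) * self.chunk_size
            if start != self.chunk_start:
                self.shared_chunk.set_value(
                    self.data[start:start + self.chunk_size])
                self.chunk_start = start
            return numeric_index - self.chunk_start

        def __len__(self):
            return len(self.data)

A function compiled from ``dataset.variable`` (e.g.
``f = theano.function([dataset.symbolic_index], dataset.variable)``) would
then be called with the within-chunk index returned by ``load_chunk_for``.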
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
249 |
1362
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
250 OD replies to Razvan's point 2: I think what you are saying is another concern |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
251 I had, which was the fact it may be confusing to mix in the same class the |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
252 Variable/Op and DataSet interfaces. I would indeed prefer to keep them |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
253 separate. However, it may be possible to come up with a system that would get |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
254 the best of both worlds (maybe by having the Op/Variable as members of |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
255 Dataset, and just asking the user building a theano graph to use these instead |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
256 of the dataset directly). Note that I'm mixing up Op/Variable here, because |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
257 it's just not clear yet for me which would go where... |
1361
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
258 |
1357
ffa2932a8cba
Added datalearn committee discussion file
Olivier Delalleau <delallea@iro>
parents:
diff
changeset
|
259 |
1367
9474fb4ad109
Refactored datalearn committee file to be easier to read
Olivier Delalleau <delallea@iro>
parents:
1366
diff
changeset
|
260 Discussion: Implicit / Explicit Function Compilation |
9474fb4ad109
Refactored datalearn committee file to be easier to read
Olivier Delalleau <delallea@iro>
parents:
1366
diff
changeset
|
261 ---------------------------------------------------- |
1359
5db730bb0e8e
comments on datalearn
James Bergstra <bergstrj@iro.umontreal.ca>
parents:
1357
diff
changeset
|
262 |
1361
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
263 <Razvan comments>: I assume that ``do_something_with`` is suppose to be some |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
264 numeric function, and dataset in this case is the result of some |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
265 computations on a initial dataset. |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
266 I would differentiate the two approaches (1) and (2) as : |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
267 - first of all whatever you can do with (1) you can do with (2) |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
268 - approach (1) hides the fact that you are working with symbolic graphs. |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
269 You apply functions to datasets, and when you want to see values a |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
270 function is compiled under the hood and those values are computed for |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
271 you. In approach (2) the fact that you deal with a symbolic graph is |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
272 explicit because you have to manually compile your functions. |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
273 - approach (1) needs to use this function_storage trick shared between |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
274 certain nodes of the graph to reduce the number of compilation while in |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
275 approach (2) we don't need to deal with the complexity of lazy |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
276 compilation |
1362
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
277 |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
278 OD comments: Well, to be fair, it means we put the burden of dealing with the |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
279 complexity of lazy compilation on the user (it's up to him to make sure he |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
280 compiles only one function). |
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
281 |
1361
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
282 - approach (1) needs a replace function if you want to change the dataset. |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
283 What you would do, is once you have a "computational graph" or pipeline |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
284 or whatever you call it, say ``graph``, to change the input you would do |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
285 graph.replace({ init_data_X: new_data_X}), In approach (2) the init_data_X |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
286 and new_data_X is the ``dataset`` so you would compile two different |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
287 functions. Well I would re-write (2) -- to make the above more clear -- |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
288 as : |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
289 |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
290 .. code-block:: python |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
291 |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
292 symbolic_index = theano.tensor.iscalar() |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
293 get_sample1 = theano.function( [symbolic_index], |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
294 graph( dataset[symbolic_index] ).variable) |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
295 for numeric_index in xrange(len(dataset)): |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
296 do_something_with(get_sample(numeric_index)) |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
297 |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
298 get_sample2 = theano.function( [symbolic_index], |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
299 graph( new_dataset[symbolic_index] ).variable) |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
300 ## Note: the dataset was replaced with new_dataset |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
301 for numeric_index in xrange(len(new_dataset)): |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
302 do_something_with(get_sample2(numeric_index)) |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
303 |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
304 ######### FOR (1) you write: |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
305 |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
306 for datapoint in graph: |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
307 do_something_with( datapoint() ) |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
308 |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
309 new_graph = graph.replace({dataset:dataset2}) |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
310 |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
311 for datapoint in new_graph: |
7548dc1b163c
Some question/suggestions to datalearn
Razvan Pascanu <r.pascanu@gmail.com>
parents:
1359
diff
changeset
|
312 do_something_with(datapoint()) |
1362
6b9673d72a41
Datalearn replies / comments
Olivier Delalleau <delallea@iro>
parents:
1361
diff
changeset
|
313 |
OD comments: I don't really understand what 'graph' is in this code (it
appears in both approaches but is used differently). What I have in mind
would be closer to the first approach you describe (#2) with 'graph'
removed, and to the second one (#1) with graph / new_graph replaced by
dataset / new_dataset. You wouldn't need to call some graph.replace method:
the graphs compiled for iterating on 'dataset' and 'new_dataset' would be
entirely separate (using two different compiled functions, pretty much like
#2).

RP answers: Yes, you are right. What I was trying to say is that if you have
two different datasets to which you want to apply the same pre-processing,
you can do that in both approaches. ``graph`` represents the pre-processing
steps in (2) and the end dataset (after preprocessing) in (1). So the idea
is that instead of building new_graph from scratch (re-applying all the
transforms to the original dataset) you can use replace. Or maybe the
__call__ (which compiles the function if needed) can take a givens
dictionary (that replaces datasets or more). I only brought this up because
I thought it is an issue people will raise. They will say: well, in (2) the
pipeline logic is separated from the data, so you can easily use the same
transformation with different data, while in (1) you write the
transformation rooted in a dataset, and if you want the same transformation
for a different dataset you have to re-write everything.

OD replies: Still not sure I understand. If you have a "graph" function that
takes a dataset as input and outputs a new dataset, you can use this same
function with both (1) and (2). With (2) it is:

    theano.function([index], graph(my_dataset)[index].variable)

while with (1) the same function is compiled implicitly with:

    for sample in graph(my_dataset):
        ...

RP answers: Right. I was actually constructing this contrived example in my
mind, where you would do something like:

    i1 = f1(data)
    i2 = f2(i1)
    i3 = f3(i2)
    ...
    iN = fN(iN-1)

and then you would say: wait, I want to do this on new_data as well. Oh no,
I have to copy the entire block or whatever. That is so annoying. But
actually you could just write:

    def my_f(data):
        i1 = f1(data)
        ...
        return iN

and then just use that function, which is what you pointed out. I agree I'm
not sure anymore about the point I was trying to make. If you are a lazy
programmer and write everything without functions, you can argue that you
prefer (2) because you only pass the dataset at the end and not at the
beginning. But if (1) had the replace function, this argument would fail.
And it only holds if you don't want to make a function out of your pipeline
that takes the dataset as input, which now that I think about it is pretty
silly not to do. Sorry for that.

- in approach (1) the initial dataset object (the one that loads the data)
  decides whether you will use shared variables and indices to deal with the
  dataset or a ``theano.tensor.matrix``, and not the user (at least not
  without hacking the code). Of course, whoever writes that class can add a
  flag to it to switch between whatever behaviours make sense. In approach
  (2) one is not forced by construction to make this choice inside that
  class, though by convention I would do it there. So if you consider the
  one who writes that class to be a developer, then in (2) the user can
  decide / deal with this, and not the developer. Though this is a fine line
  -- I would say the user would actually write that class as well, using
  some template. That is to say, (2) looks and feels more like working with
  Theano directly (see the sketch below for what such a flag could look
  like).

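For illustration, here is a minimal sketch of the kind of flag mentioned in
the point above; ``ArrayDataSet``, ``use_shared`` and the attribute names
are made up for this example and are not an existing pylearn API.

.. code-block:: python

    import numpy
    import theano
    import theano.tensor as T

    class ArrayDataSet(object):
        """Hypothetical initial dataset object deciding how data is exposed."""

        def __init__(self, data, use_shared=True):
            if use_shared:
                # Data lives in a shared variable; samples are addressed by a
                # symbolic index that becomes an input of compiled functions.
                self.data = theano.shared(numpy.asarray(data), name='data')
                self.index = T.lscalar('index')
                self.variable = self.data[self.index]
                self.inputs = [self.index]
            else:
                # Data is fed in at call time through a plain symbolic matrix.
                self.variable = T.matrix('data')
                self.inputs = [self.variable]
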
Bottom line, I think (1) puts more stress on the development of the library,
and hides Theano and some of the complexity for day-to-day usage. In (2)
everything is a bit more explicit, leaving the impression that you have more
control over the code, though I strongly feel that whatever can be done in
(2) can be done in (1). I was originally more inclined towards (1), but now
I'm not so sure; I think both are equally interesting and valid options.
</Razvan comments>

Discussion: Fixed Parameters vs. Function Arguments
---------------------------------------------------

Razvan Comment: I thought about this a bit at the Pylearn level. In my
original train of thought you would have the distinction between
``hand-picked parameters``, which I would call hyper-parameters, and learned
parameters. A transformation in this framework (an op if you wish) could
take as inputs DataSet(s), DataField(s), Parameter(s) (which are the things
that the learner should adapt) and HyperParameter(s). All hyper-parameters
would turn into arguments of the compiled function (like the indices of each
of the dataset objects) and therefore they can be changed without
re-compilation. In other words, this can easily be done by having new types
of Variables that would represent Parameters and Hyper-parameters. As an
ending note I would say that there are hyper-parameters for which you need
to recompile the Theano function and that cannot simply be arguments of the
compiled function (so we would have yet another category?).

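As a purely illustrative aside in plain Theano (not a proposed Pylearn API):
a learned parameter can live in a shared variable while a hyper-parameter
such as the learning rate is an ordinary input of the compiled function, so
its value can be changed between calls without recompiling.

.. code-block:: python

    import numpy
    import theano
    import theano.tensor as T

    w = theano.shared(numpy.zeros(5), name='w')  # learned parameter
    x = T.dvector('x')
    lr = T.dscalar('lr')                         # hyper-parameter
    cost = ((w - x) ** 2).sum()
    train = theano.function([x, lr], cost,
                            updates=[(w, w - lr * T.grad(cost, w))])

    train(numpy.ones(5), 0.1)    # the learning rate is just an argument...
    train(numpy.ones(5), 0.01)   # ...so changing it needs no recompilation
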
Yoshua's comments on RP's comments: I don't understand why we would
need to create these types. Isn't it just a matter for the programmer
to decide what are the inputs of the compiled function, and which
are possibly constant (e.g. holding some hyper-parameters constant
for a while)?

RP answers: If we opt for this lazy compilation mechanism, the library needs
to know what to put into a shared variable and what to expect as an input.
The programmer should give hints to the library by saying "this value will
always be constant", or "this is a hyper-parameter that I might want to
change, and when I do I don't want to recompile everything, so make it an
argument". Even when the compilation is done by the user, it would be
helpful to have some function that collects all the parameters for you.
What I mean is that it would be nice to write something like

    corruption_layer_1 = Parameter(value=0.1, name='c1')
    # Followed by (many) lines of code
    f = function(results.inputs() + results.hyper_params(), result)

where results.hyper_params parses the graph, collects the hyper-parameters
and returns them as a list of theano Variables wrapped in theano.In with a
default value and a name. You could call the function either as

    f()

or

    f(c1=0.2)

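The f() / f(c1=0.2) calling convention sketched above can already be
obtained with plain Theano through ``theano.In``; the graph below is just a
stand-in, and a ``results.hyper_params()`` helper that collects such inputs
automatically remains hypothetical.

.. code-block:: python

    import theano
    import theano.tensor as T

    c1 = T.dscalar('c1')
    out = 1.0 - c1   # stand-in for a graph built from many transformations

    # Wrapping the variable in theano.In gives it a name and a default value.
    f = theano.function([theano.In(c1, value=0.1, name='c1')], out)

    f()         # uses the default value 0.1
    f(c1=0.2)   # overrides it for this call, without recompiling
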
OD comments: Here is a (hopefully simpler) suggestion to solve this problem.
Consider any data{set,point} obtained by a transformation of an existing
data{set,point} with parameters p1, p2, ..., pN. From the point of view of
Theano variables, this is something like x2 = h(x1, p1=v1, ..., pN=vN) where
x1, x2 are variables and h is an Op. In addition, v1 ... vN are also
variables, since they are parameters of the transformation that we may want
to vary. This is not, however, the way the user would build the graph,
because being forced to use variables for parameters is not user-friendly
(IMO). Instead, someone would write:

    d2 = t(d1, p1=w1, ..., pN=wN)

where d1, d2 are data{set,point}s, t is the transformation, and w1 ... wN
are numeric values of the parameters. Then t would build the piece of graph
above, so that when you ask for d2.numeric_value(), a function computing x2
would be compiled, taking the variables v1, ..., vN as inputs.
Now, the problem is that this may not be fully optimized, since parameters
are assumed to be varying (so as not to be forced to recompile a different
function when the user calls t with different parameter values). My
suggestion is to make this the default behavior, but add an extra argument
to t:

    d2 = t(d1, p1=w1, ..., pN=wN, constants=['p3', 'p5'])

The line above would do the same, except that the function being compiled
would use the constant values w3 and w5 for p3 and p5.
Razvan's example above would be written in a different way, as follows:

    def f(c1=0.2):
        return transformK(..(transform2(transform1(input_data,
                                                   corruption_layer_1=c1))))

With this code you could create various transformed datasets by calling f
with different values for c1. The first time you call
f(c1=0).numeric_value(), a Theano function is compiled that takes a
`corruption_layer_1` input variable (whose value is 0 when the function is
called by `numeric_value`). If you then call f().numeric_value(), the same
function is re-used (no need to compile it) with this input set to 0.2. If
on the other hand you want to compile a new function for each new value of
your `corruption_layer_1` parameter, you would instead write:

    def f(c1=0.2):
        return transformK(..(transform2(transform1(input_data,
                                                   corruption_layer_1=c1,
                                                   constants=['corruption_layer_1']))))

This would be one way to have an automatic lazy function cache / compilation
while still letting the user specify for which parameters a new function
needs to be compiled when their value changes.

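To make this suggestion a bit more concrete, here is a hypothetical sketch
(``transformed_value``, ``build_graph`` and the caching policy are made up
for illustration, not an existing API) of lazy compilation where the
parameters listed in ``constants`` are baked into the graph while all the
others become inputs of a cached compiled function.

.. code-block:: python

    import theano
    import theano.tensor as T

    _cache = {}

    def transformed_value(x_value, build_graph, constants=(), **params):
        # A new function is compiled whenever the values of the 'constant'
        # parameters change; varying parameters are inputs of that function,
        # so changing them re-uses the cached compiled function.
        key = (build_graph,
               tuple(sorted((name, params[name]) for name in constants)))
        if key not in _cache:
            x = T.dvector('x')
            inputs, sym_params = [x], {}
            for name in sorted(params):
                if name in constants:
                    sym_params[name] = T.constant(params[name], name=name)
                else:
                    var = T.dscalar(name)
                    sym_params[name] = var
                    inputs.append(theano.In(var, name=name, value=params[name]))
            _cache[key] = theano.function(inputs, build_graph(x, **sym_params))
        varying = dict((k, v) for (k, v) in params.items() if k not in constants)
        return _cache[key](x_value, **varying)

    def corrupt(x, corruption_layer_1):
        return x * (1 - corruption_layer_1)

    transformed_value([1., 2.], corrupt, corruption_layer_1=0.0)  # compiles once
    transformed_value([1., 2.], corrupt, corruption_layer_1=0.5)  # re-uses it
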
RP comment: What about the same trick that Theano uses, namely that if you
want a non-"default" behaviour you wrap the input in a special object? You
would write

    transform1(input_data, corruption_layer_1=In(value=c1, fixed=True))

I started to like this approach of passing extra info about an argument :).
Other than this it sounds good to me.

OD replies: Yes, I guess it would make sense. The more I look at it, the more
it seems like it is very close to directly writing a Theano transform on some
variables.

Discussion: Helper Functions
----------------------------

James: Another syntactic option for iterating over datasets is

.. code-block:: python

    for sample in dataset.numeric_iterator(batchsize=10):
        do_something_with(sample)

The numeric_iterator would create a symbolic batch index, and compile a
single function that extracts the corresponding minibatch. The arguments to
the numeric_iterator function can also specify what compile mode to use, any
givens you might want to apply, etc.

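A hypothetical sketch of such a ``numeric_iterator``, assuming the dataset's
numeric data already lives in a Theano shared variable (``data`` below); the
name and signature are taken from the example above and are not an existing
pylearn method.

.. code-block:: python

    import numpy
    import theano
    import theano.tensor as T

    def numeric_iterator(data, batchsize=10, mode=None, givens=None):
        """Yield numeric minibatches from `data`, a Theano shared variable."""
        i = T.lscalar('batch_index')
        minibatch = data[i * batchsize:(i + 1) * batchsize]
        # A single function is compiled; only the numeric index varies.
        f = theano.function([i], minibatch, mode=mode, givens=givens)
        n_batches = data.get_value(borrow=True).shape[0] // batchsize
        for index in xrange(n_batches):
            yield f(index)

    data = theano.shared(numpy.arange(100.).reshape(50, 2))
    for sample in numeric_iterator(data, batchsize=10):
        print sample.shape   # (10, 2)
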
Yoshua's comment to James' comment: I like that approach.

OD comments: Would there also be some kind of function cache to avoid
compiling the same function again if we re-iterate on the same dataset with
the same arguments? Maybe a more generic issue is: would there be a way for
Theano to be more efficient when re-compiling the same function that was
already compiled in the same program? (note that I am assuming here it is
not efficient, but I may be wrong).

OD adds: After thinking more about it, this seems very close to my first
version where a function is automatically compiled "under the hood" when
iterating on a dataset and accessing the numeric value of a resulting
sample. The main differences are:
- In your version, the result is directly a numeric value, while in my
  version one would obtain symbolic samples and would need to call some
  method to obtain their numeric value. I think I like mine a bit better
  because it means you can use the same syntax to e.g. iterate on a dataset,
  whether you are interested in the symbolic representation of samples or in
  their numeric values. On the other hand, doing so could be less efficient
  since you create an intermediate representation you may not use. The
  overhead does not seem like much to me, but I am not sure about that.
- In your version, you can provide e.g. compile modes / givens to the
  function. This could probably also be done in my version, although it
  makes it more difficult if you want to cache the function to avoid
  compiling it more than once (see next point).
- (Related to my first comment above) In your version it seems like a new
  function would be compiled every time the user calls e.g.
  'numeric_iterator', while in my version the function would be compiled
  only once. Maybe this can be solved at the Theano level with an efficient
  function cache?

Discussion: Dataset as Learner Output
-------------------------------------

James asks:
What's wrong with simply passing the variables corresponding to the dataset
to the constructor of the learner?
That seems much more flexible, compact, and clear than the decorator.

OD replies: Not sure I understand your idea here. We probably want a learner
to be able to compute its output on multiple datasets, without having to
point to these datasets within the learner itself (which seems cumbersome to
me). The point of the decorators is mostly to turn a single function (that
outputs a theano variable for the output computed on a single sample) into a
function that can compute symbolic datasets as well as numeric sample
outputs. Those could instead be different functions in the base Learner
class if the decorator approach is considered ugly / confusing.

Razvan asks: What is predict_sample for? What is predict_dataset? What I
guess you mean is that the decorator is used to convert a function that
takes a theano variable and outputs a theano variable into a class/function
that takes a DataField/DataSet and outputs a DataField/DataSet. It could
also register all those different functions, so that the Dataset you get out
of the entire Learner (not out of one of those functions; this Dataset is
returned by __call__) would contain all of them as fields.
I would use it like this:

.. code-block:: python

    nnet = NeuralNetwork()
    results = nnet(dataset)
    for datapoint in results:
        print datapoint.prediction, datapoint.nll, ...

Is this close to what you are suggesting?

OD: Yes, you guessed right, the decorator's role is to do something different
depending on the input to the function (see my reply to James above).

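To make the decorator idea a bit more concrete, here is a hypothetical
sketch; ``DataSet``, ``symbolic_samples`` and ``datalearn_method`` are
made-up names used only for illustration, not an existing pylearn API.

.. code-block:: python

    import functools

    def datalearn_method(single_sample_fn):
        """Turn a method written for one symbolic sample into one that also
        accepts a (symbolic) DataSet and returns a new DataSet."""
        @functools.wraps(single_sample_fn)
        def wrapper(self, data):
            if isinstance(data, DataSet):
                # Map the single-sample computation over the dataset's
                # symbolic samples and wrap the results in a new DataSet.
                return DataSet([single_sample_fn(self, sample)
                                for sample in data.symbolic_samples()])
            # Plain Theano variable: behave exactly like the original method.
            return single_sample_fn(self, data)
        return wrapper

    class NeuralNetwork(object):
        @datalearn_method
        def prediction(self, sample_variable):
            # ... build and return a Theano variable from `sample_variable` ...
            return sample_variable
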