doc/v2_planning/arch_src/plugin_JB_comments_RP.txt @ 1214:681b5e7e3b81
a few comments on James version
author: Razvan Pascanu <r.pascanu@gmail.com>
date:   Wed, 22 Sep 2010 10:39:39 -0400
I agree with Ian, maybe using caps is not the best idea. It reminds me of BASIC, which I used to
write a long time ago :). It also makes the code look a bit scary.

I like the approach, and I think it is close to my earliest proposition and to what I am proposing
for the layer committee (though we have not had a meeting yet). I would, though, write it in a more
Theano-like way (Ian has an example of how that would look). I would also drop the CALL and FLIT
constructs and instead have a decorator (or something similar) that wraps a function and turns it
into a call or flit. I hope this is only syntactic sugar (does it change anything in the actual
implementation?) that makes things more natural. What I want to reach is something that looks very
much like Theano, except that now you are building the graph of execution steps. Refactoring what
you wrote, it would look like this:

x = buffer_repeat(1000, dataset.next())
train_pca = pca.analyze(x)

train_pca.run()

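Just to make the decorator idea concrete, here is a rough sketch. Everything below (the Node class,
the lazy decorator, the double function) is only my illustration of the principle, not James's API
and not anything that exists in Theano:

# Hypothetical sketch: a decorator that turns an ordinary function into a
# node-building call, so composing decorated functions records a graph of
# execution steps instead of running anything immediately.

class Node(object):
    def __init__(self, fn, inputs):
        self.fn = fn
        self.inputs = inputs          # parent nodes (or plain constants)

    def run(self):
        # evaluate the parents first, then apply this node's own function
        args = [i.run() if isinstance(i, Node) else i for i in self.inputs]
        return self.fn(*args)

def lazy(fn):
    # calling the wrapped function builds a Node rather than executing fn
    def wrapper(*inputs):
        return Node(fn, list(inputs))
    return wrapper

@lazy
def double(x):
    return 2 * x

graph = double(double(21))            # builds the graph, runs nothing yet
print(graph.run())                    # 84 -- executes the recorded steps

In this picture buffer_repeat, pca.analyze and so on would simply return nodes of this kind, and
run() walks the recorded steps.
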
If you allow a FLIT to also take multiple inputs (so not just one), which comes naturally in this
way of writing, you can describe a DAG that not only gives the order of execution but also deals
with what takes data from what. I'm sorry for not being there yesterday; from what I remember, I
have the feeling that for you this is done under the hood and not taken care of by these flow
control structures.

To be a bit more explicit, in the way of writing the code above you can see that:
a) dataset.next() has to run before pca.analyze
b) pca.analyze needs the result (data) object of buffer_repeat(1000, dataset.next())

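Continuing the toy Node sketch from above (again, purely illustrative): once every node records its
inputs, a valid execution order falls out of the graph itself, so both a) and b) become properties
of the DAG rather than separate declarations.

def schedule(node, seen=None, order=None):
    # post-order walk over the inputs: every parent is listed before the
    # node that consumes its result, which is exactly constraint a) above
    if seen is None:
        seen, order = set(), []
    for parent in node.inputs:
        if isinstance(parent, Node) and id(parent) not in seen:
            seen.add(id(parent))
            schedule(parent, seen, order)
    order.append(node)
    return order

# for graph = double(double(21)) from the earlier sketch, schedule(graph)
# returns [inner double node, outer double node]: dependencies come first
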
I've actually elaborated on this idea here and there, figured out what the result of such a control
flow element is, and how to make everything explicit in the graph. Part of this is in my
plugin_RP.py (Step 1), though it is a bit of a moving target. I also have a slightly different way
of writing REPEAT and BUFFER_REPEAT, though I think it is mostly the same. I actually did not know
how to deal with distributed things until I saw how you handle that in your code. Here is a
copy-pasted version of an SDAA written my way:

# Layer 1:

data_x, data_y = GPU_transform(load_mnist())
noisy_data_x = gaussian_noise(data_x, amount=0.1)
hidden1 = tanh(dotW_b(data_x, n_units=200))
reconstruct1 = reconstruct(hidden1.replace(data_x, noisy_data_x),
                           noisy_data_x)
err1 = cross_entropy(reconstruct1, data_x)
learner1 = SGD(err1)

# Layer 2:

noisy_hidden1 = gaussian_noise(hidden1, amount=0.1)
hidden2 = tanh(dotW_b(hidden1, n_units=200))
reconstruct2 = reconstruct(hidden2.replace(hidden1, noisy_hidden1),
                           noisy_hidden1)
err2 = cross_entropy(reconstruct2, hidden1)
learner2 = SGD(err2)

# Top layer:

output = sigmoid(dotW_b(hidden2, n_units=10))
err = cross_entropy(output, data_y)
learner = SGD(err)

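The way I read hidden1.replace(data_x, noisy_data_x) is as graph substitution: reuse the same layer
(same weights, same function) but fed with the noisy input. In the toy Node vocabulary from above, a
stand-alone version might look like this (hypothetical, only to pin down the intended semantics;
whether it is a method or a free function is just spelling):

def replace(node, old, new):
    # return a copy of the graph rooted at `node` in which every occurrence
    # of the sub-graph `old` is swapped for `new`; fn (and hence any weights
    # it closes over) is shared between the original and the copy
    if node is old:
        return new
    if not isinstance(node, Node):
        return node                   # constants pass through unchanged
    return Node(node.fn, [replace(i, old, new) for i in node.inputs])
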
GPU_transform, gaussian_noise and so on are functions that have been decorated (or classes, if you
want) that you would write using FLIT. reconstruct, for me, is a different control flow element. In
this case I don't use REPEAT or BUFFER_REPEAT or the other very cool control flow elements, but you
can easily imagine writing something like:

pretrained_in_parallel = weave(learner1, learner2)
results = spawn(repeat(5000, learner1), repeat(500, learner2))
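
And, still in the same toy vocabulary (again my own illustration, not the real plugin API), repeat
could itself be just another node whose run() re-executes the wrapped step a fixed number of times;
spawn and weave would be similar nodes that parallelize or interleave their inputs.

class Repeat(Node):
    # control flow as a node: running it re-runs the wrapped node n times
    # and returns the value of the last iteration
    def __init__(self, n, node):
        Node.__init__(self, None, [node])   # fn is unused; run() is overridden
        self.n = n

    def run(self):
        result = None
        for _ in range(self.n):
            result = self.inputs[0].run()
        return result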