annotate deep/stacked_dae/v_guillaume/sgd_optimization.py @ 647:47af8a002530 tip

changed Theano to ift6266 and remove numpy as we do not use code from numpy in this repository
author Razvan Pascanu <r.pascanu@gmail.com>
date Wed, 17 Oct 2012 09:26:14 -0400
parents 0ca069550abd
children
rev   line source
436: all lines from rev 0ca069550abd (Added : single class version of SDA, by Guillaume Sicard <guitch21@gmail.com>)
#!/usr/bin/python
# -*- coding: utf-8 -*-

# Generic SdA optimization loop, adapted from the deeplearning.net tutorial

import numpy
import theano
import time
import datetime
import theano.tensor as T
import sys
import cPickle

from jobman import DD
import jobman, jobman.sql
from copy import copy

from stacked_dae import SdA

from ift6266.utils.seriestables import *

# For test purposes only
buffersize = 1000

default_series = {
    'reconstruction_error' : DummySeries(),
    'training_error' : DummySeries(),
    'validation_error' : DummySeries(),
    'test_error' : DummySeries(),
    'params' : DummySeries()
}

def itermax(iter, max):
    # yield at most `max` items from the iterator `iter`
    for i, it in enumerate(iter):
        if i >= max:
            break
        yield it

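The itermax helper caps an iterator at a fixed number of items, equivalent to itertools.islice. A minimal standalone sketch (written in Python 3 for illustration; the repository code itself is Python 2):

```python
def itermax(iterator, limit):
    # yield at most `limit` items, like itertools.islice(iterator, limit)
    for i, item in enumerate(iterator):
        if i >= limit:
            break
        yield item

# cap a long range at three items
capped = list(itermax(range(100), 3))
print(capped)  # [0, 1, 2]
```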
class SdaSgdOptimizer:
    def __init__(self, dataset_name, dataset, hyperparameters, n_ins, n_outs,
                 examples_per_epoch, series=default_series, max_minibatches=None):
        self.dataset_name = dataset_name
        self.dataset = dataset
        self.hp = hyperparameters
        self.n_ins = n_ins
        self.n_outs = n_outs
        self.parameters_pre = []

        # offset the labels so that each single-class dataset starts at class 0
        if self.dataset_name == "upper":
            self.class_offset = 10
        elif self.dataset_name == "lower":
            self.class_offset = 36
        else:
            self.class_offset = 0

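These offsets follow a NIST-style label layout (digits 0-9, uppercase 10-35, lowercase 36-61), so subtracting the offset remaps each single-class subset to start at 0. A hypothetical sketch of the same mapping (the dictionary and function names are illustrative, not from the repository):

```python
# hypothetical mirror of the constructor's offset mapping
CLASS_OFFSETS = {"upper": 10, "lower": 36}

def offset_labels(labels, dataset_name):
    # shift raw labels so the selected subset starts at class 0
    offset = CLASS_OFFSETS.get(dataset_name, 0)
    return [y - offset for y in labels]

print(offset_labels([10, 11, 35], "upper"))  # [0, 1, 25]
```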

        self.max_minibatches = max_minibatches
        print "SdaSgdOptimizer, max_minibatches =", max_minibatches

        self.ex_per_epoch = examples_per_epoch
        self.mb_per_epoch = examples_per_epoch / self.hp.minibatch_size

        self.series = series

        self.rng = numpy.random.RandomState(1234)

        self.init_classifier()

        sys.stdout.flush()

    def init_classifier(self):
        print "Constructing classifier"

        # we don't want to save arrays in DD objects, so
        # we recreate those arrays here
        nhl = self.hp.num_hidden_layers
        layers_sizes = [self.hp.hidden_layers_sizes] * nhl
        corruption_levels = [self.hp.corruption_levels] * nhl

        # construct the stacked denoising autoencoder class
        self.classifier = SdA(
            batch_size=self.hp.minibatch_size,
            n_ins=self.n_ins,
            hidden_layers_sizes=layers_sizes,
            n_outs=self.n_outs,
            corruption_levels=corruption_levels,
            rng=self.rng,
            pretrain_lr=self.hp.pretraining_lr,
            finetune_lr=self.hp.finetuning_lr)

        #theano.printing.pydotprint(self.classifier.pretrain_functions[0], "function.graph")

        sys.stdout.flush()

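init_classifier stores a single scalar per hyperparameter and replicates it into a per-layer list, so every layer shares the same size and corruption level. A toy sketch of that replication (values below are illustrative, not taken from the repository):

```python
# illustrative hyperparameter values
num_hidden_layers = 3
hidden_layers_size = 1000
corruption_level = 0.2

# expand each scalar into one entry per layer
layers_sizes = [hidden_layers_size] * num_hidden_layers
corruption_levels = [corruption_level] * num_hidden_layers
print(layers_sizes)        # [1000, 1000, 1000]
print(corruption_levels)   # [0.2, 0.2, 0.2]
```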
    def train(self):
        self.pretrain(self.dataset)
        # NOTE: finetune() as defined below also expects dataset_test,
        # num_finetune and ind_test; this call passes only the dataset.
        self.finetune(self.dataset)

    def pretrain(self, dataset, decrease=0):
        print "STARTING PRETRAINING, time = ", datetime.datetime.now()
        sys.stdout.flush()

        un_fichier = int(819200.0 / self.hp.minibatch_size)  # number of minibatches in a P07 file

        start_time = time.clock()

        # Set the decreasing rate of the learning rate; the decay is linear.
        # With decrease=0.95 the final learning rate is 5% of the original.
        # The example count (800000) is hardcoded and can be changed at will.
        decreasing = (decrease*self.hp.pretraining_lr)/float(self.hp.pretraining_epochs_per_layer*800000/self.hp.minibatch_size)

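The `decreasing` step implements a linear decay: subtracting it once per minibatch over all pretraining updates leaves the learning rate at (1 - decrease) of its initial value. A sketch with assumed toy numbers (the real code derives the update count from pretraining_epochs_per_layer * 800000 / minibatch_size):

```python
# assumed toy values
pretraining_lr = 0.01
decrease = 0.95
total_updates = 100

step = (decrease * pretraining_lr) / float(total_updates)
lr = pretraining_lr
for _ in range(total_updates):
    lr -= step
# after all updates, lr is about 5% of the starting rate
print(lr)
```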
        ## Pre-train layer-wise
        for i in xrange(self.classifier.n_layers):
            # reset the learning rate to its original value for each layer
            learning_rate = self.hp.pretraining_lr
            # go through the pretraining epochs
            for epoch in xrange(self.hp.pretraining_epochs_per_layer):
                # go through the training set
                batch_index = 0
                count = 0
                num_files = 0
                for x, y in dataset.train(self.hp.minibatch_size):
                    y = y - self.class_offset
                    c = self.classifier.pretrain_functions[i](x, learning_rate)
                    count += 1

                    self.series["reconstruction_error"].append((epoch, batch_index), c)
                    batch_index += 1

                    # decrease the learning rate for the pretrain if requested
                    if decrease != 0:
                        learning_rate -= decreasing

                    # useful when doing tests
                    if self.max_minibatches and batch_index >= self.max_minibatches:
                        break

                    # When we pass through the data only once (the case with P07),
                    # there are approximately 800*1024 = 819200 examples per file
                    # (1K per example; files are 800M)
                    if self.hp.pretraining_epochs_per_layer == 1 and count % un_fichier == 0:
                        print 'Pre-training layer %i, file %d, cost ' % (i, num_files), c
                        num_files += 1
                        sys.stdout.flush()
                        self.series['params'].append((num_files,), self.classifier.all_params)

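The count % un_fichier check fires once per P07 file: with 819200 examples per file, a file boundary falls every un_fichier minibatches. A toy verification (minibatch size is illustrative):

```python
minibatch_size = 32                          # illustrative
un_fichier = int(819200.0 / minibatch_size)  # minibatches per P07 file

# minibatch counts at which the progress message would fire, over 3 files
boundaries = [c for c in range(1, 3 * un_fichier + 1) if c % un_fichier == 0]
print(boundaries)  # [25600, 51200, 76800]
```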
                # when NIST is used
                if self.hp.pretraining_epochs_per_layer > 1:
                    print 'Pre-training layer %i, epoch %d, cost ' % (i, epoch), c
                    sys.stdout.flush()

                    self.series['params'].append((epoch,), self.classifier.all_params)

        end_time = time.clock()

        print ('Pretraining took %f minutes' % ((end_time - start_time) / 60.))
        self.hp.update({'pretraining_time': end_time - start_time})

        sys.stdout.flush()

        # save the pretrained parameters so they can be loaded later
        # for tests on finetuning
        self.parameters_pre = [copy(x.value) for x in self.classifier.params]
        f = open('params_pretrain.txt', 'wb')  # binary mode: cPickle protocol -1 is binary
        cPickle.dump(self.parameters_pre, f, protocol=-1)
        f.close()

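The save above pickles the copied parameter arrays with protocol -1 (the highest, binary, protocol). A minimal round-trip sketch using Python 3's pickle and an in-memory buffer in place of the params_pretrain.txt file:

```python
import io
import pickle

params = [[0.1, 0.2], [0.3]]   # stand-in for the copied weight arrays

buf = io.BytesIO()             # binary buffer; protocol -1 output is binary
pickle.dump(params, buf, protocol=-1)

buf.seek(0)
restored = pickle.load(buf)
print(restored == params)  # True
```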
    def finetune(self, dataset, dataset_test, num_finetune, ind_test, special=0, decrease=0):

        if special != 0 and special != 1:
            sys.exit('Bad value for variable special. Must be in {0,1}')
        print "STARTING FINETUNING, time = ", datetime.datetime.now()

        minibatch_size = self.hp.minibatch_size
        if ind_test == 0 or ind_test == 20:
            nom_test = "NIST"
            nom_train = "P07"
        else:
            nom_test = "P07"
            nom_train = "NIST"

        # create a function to compute the mistakes that are made by the model
        # on the validation set, or testing set
        test_model = theano.function(
            [self.classifier.x, self.classifier.y], self.classifier.errors)

        validate_model = theano.function(
            [self.classifier.x, self.classifier.y], self.classifier.errors)

        # early-stopping parameters
        patience = 10000               # look at this many examples regardless
        patience_increase = 2.         # wait this much longer when a new best is found
        improvement_threshold = 0.995  # a relative improvement of this much is
                                       # considered significant
        # go through this many minibatches before checking the network on the
        # validation set; in this case we check every epoch
        validation_frequency = min(self.mb_per_epoch, patience / 2)
        if self.max_minibatches and validation_frequency > self.max_minibatches:
            validation_frequency = self.max_minibatches / 2

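The patience scheme stops training once no significant improvement (a relative factor of improvement_threshold) has been seen for a while, extending the budget by patience_increase whenever a new best appears. A simplified sketch over a fixed list of validation losses (all values assumed; the real loop counts minibatch iterations):

```python
patience = 4                   # minimum number of checks before stopping
patience_increase = 2.
improvement_threshold = 0.995

best = float('inf')
losses = [1.0, 0.8, 0.79, 0.79, 0.79, 0.79]
stopped_at = None
for t, loss in enumerate(losses):
    if loss < best * improvement_threshold:
        # significant improvement: extend the patience budget
        patience = max(patience, int(t * patience_increase))
        best = loss
    if t >= patience:
        stopped_at = t
        break
print(stopped_at)  # 4
```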
        best_params = None
        best_validation_loss = float('inf')
        test_score = 0.
        start_time = time.clock()

        done_looping = False
        epoch = 0

        total_mb_index = 0
        minibatch_index = 0
        parameters_finetune = []

228 if ind_test == 21:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
229 learning_rate = self.hp.finetuning_lr / 10.0
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
230 else:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
231 learning_rate = self.hp.finetuning_lr #The initial finetune lr
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
232
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
233
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
234 while (epoch < num_finetune) and (not done_looping):
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
235 epoch = epoch + 1
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
236
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
237 for x,y in dataset.train(minibatch_size,bufsize=buffersize):
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
238 minibatch_index += 1
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
239
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
240 y = y - self.class_offset
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
241
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
242 if special == 0:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
243 cost_ij = self.classifier.finetune(x,y,learning_rate)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
244 elif special == 1:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
245 cost_ij = self.classifier.finetune2(x,y)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
246 total_mb_index += 1
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
247
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
248 self.series["training_error"].append((epoch, minibatch_index), cost_ij)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
249
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
250 if (total_mb_index+1) % validation_frequency == 0:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
251 #minibatch_index += 1
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
252 #The validation set is always NIST (we want the model to be good on NIST)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
253 if ind_test == 0 | ind_test == 20:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
254 iter=dataset_test.valid(minibatch_size,bufsize=buffersize)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
255 else:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
256 iter = dataset.valid(minibatch_size,bufsize=buffersize)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
257 if self.max_minibatches:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
258 iter = itermax(iter, self.max_minibatches)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
259 validation_losses = [validate_model(x,y - self.class_offset) for x,y in iter]
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
260 this_validation_loss = numpy.mean(validation_losses)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
261
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
262 self.series["validation_error"].\
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
263 append((epoch, minibatch_index), this_validation_loss*100.)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
264
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
265 print('epoch %i, minibatch %i, validation error on NIST : %f %%' % \
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
266 (epoch, minibatch_index+1, \
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
267 this_validation_loss*100.))
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
268
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
269
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
270 # if we got the best validation score until now
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
271 if this_validation_loss < best_validation_loss:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
272
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
273 #improve patience if loss improvement is good enough
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
274 if this_validation_loss < best_validation_loss * \
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
275 improvement_threshold :
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
276 patience = max(patience, total_mb_index * patience_increase)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
277
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
278 # save best validation score, iteration number and parameters
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
279 best_validation_loss = this_validation_loss
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
280 best_iter = total_mb_index
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
281 parameters_finetune=[copy(x.value) for x in self.classifier.params]
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
282
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
283 # test it on the test set
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
284 iter = dataset.test(minibatch_size,bufsize=buffersize)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
285 if self.max_minibatches:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
286 iter = itermax(iter, self.max_minibatches)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
287 test_losses = [test_model(x,y - self.class_offset) for x,y in iter]
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
288 test_score = numpy.mean(test_losses)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
289
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
290 #test it on the second test set
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
291 iter2 = dataset_test.test(minibatch_size,bufsize=buffersize)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
292 if self.max_minibatches:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
293 iter2 = itermax(iter2, self.max_minibatches)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
294 test_losses2 = [test_model(x,y - self.class_offset) for x,y in iter2]
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
295 test_score2 = numpy.mean(test_losses2)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
296
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
297 self.series["test_error"].\
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
298 append((epoch, minibatch_index), test_score*100.)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
299
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
300 print((' epoch %i, minibatch %i, test error on dataset %s (train data) of best '
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
301 'model %f %%') %
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
302 (epoch, minibatch_index+1,nom_train,
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
303 test_score*100.))
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
304
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
305 print((' epoch %i, minibatch %i, test error on dataset %s of best '
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
306 'model %f %%') %
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
307 (epoch, minibatch_index+1,nom_test,
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
308 test_score2*100.))
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
309
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
310 if patience <= total_mb_index:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
311 done_looping = True
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
312 break #to exit the FOR loop
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
313
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
314 sys.stdout.flush()
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
315
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
316 # useful when doing tests
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
317 if self.max_minibatches and minibatch_index >= self.max_minibatches:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
318 break
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
319
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
320 if decrease == 1:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
321 if (ind_test == 21 & epoch % 100 == 0) | ind_test == 20 | (ind_test == 1 & epoch % 100 == 0) :
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
322 learning_rate /= 2 #divide the learning rate by 2 for each new epoch of P07 (or 100 of NIST)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
323
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
324 self.series['params'].append((epoch,), self.classifier.all_params)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
325
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
326 if done_looping == True: #To exit completly the fine-tuning
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
327 break #to exit the WHILE loop
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
328
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
329 end_time = time.clock()
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
330 self.hp.update({'finetuning_time':end_time-start_time,\
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
331 'best_validation_error':best_validation_loss,\
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
332 'test_score':test_score,
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
333 'num_finetuning_epochs':epoch})
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
334
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
335 print(('\nOptimization complete with best validation score of %f %%,'
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
336 'with test performance %f %% on dataset %s ') %
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
337 (best_validation_loss * 100., test_score*100.,nom_train))
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
338 print(('The test score on the %s dataset is %f')%(nom_test,test_score2*100.))
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
339
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
340 print ('The finetuning ran for %f minutes' % ((end_time-start_time)/60.))
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
341
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
342 sys.stdout.flush()
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
343
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
344 #Save a copy of the parameters in a file to be able to get them in the future
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
345
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
346 if special == 1: #To keep a track of the value of the parameters
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
347 f = open('params_finetune_stanford.txt', 'w')
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
348 cPickle.dump(parameters_finetune,f,protocol=-1)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
349 f.close()
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
350
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
351 elif ind_test == 0 | ind_test == 20: #To keep a track of the value of the parameters
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
352 f = open('params_finetune_P07.txt', 'w')
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
353 cPickle.dump(parameters_finetune,f,protocol=-1)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
354 f.close()
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
355
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
356
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
357 elif ind_test== 1: #For the run with 2 finetunes. It will be faster.
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
358 f = open('params_finetune_NIST.txt', 'w')
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
359 cPickle.dump(parameters_finetune,f,protocol=-1)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
360 f.close()
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
361
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
362 elif ind_test== 21: #To keep a track of the value of the parameters
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
363 f = open('params_finetune_P07_then_NIST.txt', 'w')
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
364 cPickle.dump(parameters_finetune,f,protocol=-1)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
365 f.close()
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
366
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
367
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
368 #Set parameters like they where right after pre-train or finetune
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
369 def reload_parameters(self,which):
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
370
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
371 #self.parameters_pre=pickle.load('params_pretrain.txt')
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
372 f = open(which)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
373 self.parameters_pre=cPickle.load(f)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
374 f.close()
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
375 for idx,x in enumerate(self.parameters_pre):
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
376 if x.dtype=='float64':
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
377 self.classifier.params[idx].value=theano._asarray(copy(x),dtype=theano.config.floatX)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
378 else:
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
379 self.classifier.params[idx].value=copy(x)
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
380
0ca069550abd Added : single class version of SDA
Guillaume Sicard <guitch21@gmail.com>
parents:
diff changeset
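The save/reload pattern used around `reload_parameters` can be exercised end-to-end with a minimal round-trip sketch. This is an illustration, not the repository's code: it uses `pickle` (the Python 3 name for `cPickle`), a hypothetical file name, and plain lists standing in for Theano shared-variable values.

```python
import pickle

# stand-ins for the per-layer parameter values that the optimizer pickles
params = [[0.1, 0.2], [0.3]]

# dump with the highest available protocol, as protocol=-1 does above
with open('params_demo.pkl', 'wb') as f:
    pickle.dump(params, f, protocol=-1)

# read them back in the same order, as reload_parameters does
with open('params_demo.pkl', 'rb') as f:
    restored = pickle.load(f)
```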
    def training_error(self, dataset):
        # create a function to compute the mistakes the model makes
        # on the validation or test set
        test_model = theano.function(
            [self.classifier.x, self.classifier.y], self.classifier.errors)

        iter2 = dataset.train(self.hp.minibatch_size, bufsize=buffersize)
        train_losses2 = [test_model(x, y - self.class_offset) for x, y in iter2]
        train_score2 = numpy.mean(train_losses2)
        print "Training error is: " + str(train_score2)
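The patience-based early stopping used in the fine-tuning loop above can be isolated into a small self-contained sketch. The names (`patience`, `patience_increase`, `improvement_threshold`) mirror the loop's variables; the loss values in the usage line are invented for illustration.

```python
def early_stopping(validation_losses, patience=5, patience_increase=2,
                   improvement_threshold=0.995):
    """Return (index of the best validation loss, step where training stops),
    following the same rule as the fine-tuning loop above."""
    best_loss = float('inf')
    best_iter = 0
    for t, loss in enumerate(validation_losses):
        if loss < best_loss:
            # extend patience only on a sufficiently large improvement
            if loss < best_loss * improvement_threshold:
                patience = max(patience, (t + 1) * patience_increase)
            best_loss = loss
            best_iter = t
        if patience <= t + 1:
            break  # patience exhausted: stop training
    return best_iter, t + 1

# the loss plateaus after step 3, so patience eventually runs out
best_iter, stopped_at = early_stopping(
    [1.0, 0.8, 0.7, 0.69, 0.69, 0.69, 0.69, 0.69])
```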