pylearn: comparison of test_dataset.py @ 240:97f35d586727
More tests. Do we want to limit the minibatch size to the size of the dataset?
| author | Frederic Bastien <bastienf@iro.umontreal.ca> |
|---|---|
| date | Thu, 29 May 2008 10:42:29 -0400 |
| parents | 77b362a23f8e |
| children | 0fb75fdd727d |
| 239:77b362a23f8e | 240:97f35d586727 |
|---|---|
| 224 assert (numpy.append(x[id],y[id])==array[(i+4)%array.shape[0]]).all() | 224 assert (numpy.append(x[id],y[id])==array[(i+4)%array.shape[0]]).all() |
| 225 i+=1 | 225 i+=1 |
| 226 assert i==m.n_batches*m.minibatch_size | 226 assert i==m.n_batches*m.minibatch_size |
| 227 del x,y,i,id | 227 del x,y,i,id |
| 228 | 228 |
| | 229 #@todo: we can't do minibatch bigger then the size of the dataset??? |
| | 230 assert have_raised2(ds.minibatches,['x','y'],n_batches=1,minibatch_size=len(array)+1,offset=0) |
| | 231 assert not have_raised2(ds.minibatches,['x','y'],n_batches=1,minibatch_size=len(array),offset=0) |
| 229 | 232 |
| 230 def test_ds_iterator(array,iterator1,iterator2,iterator3): | 233 def test_ds_iterator(array,iterator1,iterator2,iterator3): |
| 231 l=len(iterator1) | 234 l=len(iterator1) |
| 232 i=0 | 235 i=0 |
| 233 for x,y in iterator1: | 236 for x,y in iterator1: |
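The inserted assertions rely on the test helper `have_raised2` to check that asking for a `minibatch_size` larger than the dataset raises an error, while a `minibatch_size` equal to the dataset length does not. Below is a minimal sketch of such a helper, inferred only from its call sites in the diff (the name and calling convention come from the code above; the body is an assumption and the real helper in test_dataset.py may differ):

```python
def have_raised2(to_call, *args, **kwargs):
    # Call the given callable with the supplied arguments and report
    # whether it raised an exception. This is a sketch consistent with
    # calls like have_raised2(ds.minibatches, ['x','y'], n_batches=1, ...).
    try:
        to_call(*args, **kwargs)
    except Exception:
        return True
    return False
```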