Mercurial > pylearn
diff mlp_factory_approach.py @ 265:ae0a8345869b
commented out junk in the default test (main function) of mlp_factory_approach so the test still works
| author | Thierry Bertin-Mahieux <bertinmt@iro.umontreal.ca> |
|---|---|
| date | Wed, 04 Jun 2008 17:34:01 -0400 |
| parents | a1793a5e9523 |
| children | eded3cb54930 |
```diff
--- a/mlp_factory_approach.py	Wed Jun 04 17:00:44 2008 -0400
+++ b/mlp_factory_approach.py	Wed Jun 04 17:34:01 2008 -0400
@@ -317,20 +317,20 @@
     model1.save('/tmp/model1')
 
-    denoising_aa = GraphLearner(denoising_g)
-    model1 = denoising_aa(trainset)
-    hidset = model(trainset, fieldnames=['hidden'])
-    model2 = denoising_aa(hidset)
+    #denoising_aa = GraphLearner(denoising_g)
+    #model1 = denoising_aa(trainset)
+    #hidset = model(trainset, fieldnames=['hidden'])
+    #model2 = denoising_aa(hidset)
 
-    f = open('blah', 'w')
-    for m in model:
-        m.save(f)
-    filetensor.write(f, initial_classification_weights)
-    f.flush()
+    #f = open('blah', 'w')
+    #for m in model:
+    #    m.save(f)
+    #filetensor.write(f, initial_classification_weights)
+    #f.flush()
 
-    deep_sigmoid_net = GraphLearner(deepnetwork_g)
-    deep_model = deep_sigmoid_net.load('blah')
-    deep_model.update(trainset) #do some fine tuning
+    #deep_sigmoid_net = GraphLearner(deepnetwork_g)
+    #deep_model = deep_sigmoid_net.load('blah')
+    #deep_model.update(trainset) #do some fine tuning
 
     model1_dup = learn_algo('/tmp/model1')
```
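The block being commented out saved several layer models in sequence through one open file handle, then reloaded them to build a deeper network. Below is a minimal, hypothetical sketch of that sequential save/reload pattern; `GraphLearner` and `filetensor` are not available here, so `pickle` and a toy `LayerModel` class stand in for them:

```python
import io
import pickle


class LayerModel:
    """Toy stand-in for one trained layer model (hypothetical; the real
    code used GraphLearner models serialized with filetensor)."""

    def __init__(self, weights):
        self.weights = weights

    def save(self, f):
        # Append this layer's parameters to an already-open binary stream.
        pickle.dump(self.weights, f)

    @classmethod
    def load(cls, f):
        # Read one layer's parameters from the current stream position.
        return cls(pickle.load(f))


# Save several layer models into a single stream, mirroring the
# commented-out `for m in model: m.save(f)` loop.
layers = [LayerModel([1, 2]), LayerModel([3, 4])]
buf = io.BytesIO()
for m in layers:
    m.save(buf)

# Reload the layers in the same order they were written.
buf.seek(0)
restored = [LayerModel.load(buf) for _ in layers]
```

Note that pickle records are self-delimiting, so repeated `dump` calls on one stream can be read back with matching `load` calls in order; the original code relied on `filetensor.write` giving the same append-then-read-back behavior.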