ift6266: comparison of deep/stacked_dae/v_sylvain/nist_sda.py @ 307:ed0443f7aad4
Added an extra safeguard (a bit of paranoia) as well as comments
author    SylvainPL <sylvain.pannetier.lebeuf@umontreal.ca>
date      Wed, 31 Mar 2010 21:03:41 -0400
parents   698313f8f6e6
children  bd6085d77706
comparison of 306:a78dbbc61f37 and 307:ed0443f7aad4

--- deep/stacked_dae/v_sylvain/nist_sda.py (306:a78dbbc61f37)
+++ deep/stacked_dae/v_sylvain/nist_sda.py (307:ed0443f7aad4)
@@ -73,11 +73,11 @@
 if state['pretrain_choice'] == 0:
     print('\n\tpretraining with NIST\n')
     optimizer.pretrain(datasets.nist_all())
 elif state['pretrain_choice'] == 1:
     #To know how many file will be used during pretraining
-    nb_file = state['pretraining_epochs_per_layer']
+    nb_file = int(state['pretraining_epochs_per_layer'])
     state['pretraining_epochs_per_layer'] = 1 #Only 1 time over the dataset
     if nb_file >=100:
         sys.exit("The code does not support this much pretraining epoch (99 max with P07).\n"+
                  "You have to correct the code (and be patient, P07 is huge !!)\n"+
                  "or reduce the number of pretraining epoch to run the code (better idea).\n")