Mercurial > ift6266
view datasets/nist.py @ 185:b9ea8e2d071a
Removed the code for reusing pre-training results (too complicated for too little benefit: it's the fine-tuning that is really the slow part)
author | fsavard
---|---
date | Fri, 26 Feb 2010 17:45:52 -0500
parents | e3de934a98b6
children |
line source
__all__ = ['nist_digits', 'nist_lower', 'nist_upper', 'nist_all']

from ftfile import FTDataSet

PATH = '/data/lisa/data/nist/by_class/'

nist_digits = FTDataSet(train_data = [PATH+'digits/digits_train_data.ft'],
                        train_lbl = [PATH+'digits/digits_train_labels.ft'],
                        test_data = [PATH+'digits/digits_test_data.ft'],
                        test_lbl = [PATH+'digits/digits_test_labels.ft'])

nist_lower = FTDataSet(train_data = [PATH+'lower/lower_train_data.ft'],
                       train_lbl = [PATH+'lower/lower_train_labels.ft'],
                       test_data = [PATH+'lower/lower_test_data.ft'],
                       test_lbl = [PATH+'lower/lower_test_labels.ft'])

nist_upper = FTDataSet(train_data = [PATH+'upper/upper_train_data.ft'],
                       train_lbl = [PATH+'upper/upper_train_labels.ft'],
                       test_data = [PATH+'upper/upper_test_data.ft'],
                       test_lbl = [PATH+'upper/upper_test_labels.ft'])

nist_all = FTDataSet(train_data = [PATH+'all/all_train_data.ft'],
                     train_lbl = [PATH+'all/all_train_labels.ft'],
                     test_data = [PATH+'all/all_test_data.ft'],
                     test_lbl = [PATH+'all/all_test_labels.ft'])
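The four dataset definitions follow one naming pattern: each NIST class `name` lives under `PATH + name/` with files `name_{train,test}_{data,labels}.ft`. As a hedged sketch (the helper `ft_paths` is hypothetical, not part of the repository; only the path layout is taken from the file above), the keyword arguments could be generated rather than written out four times:

```python
# Hypothetical helper: build the FTDataSet keyword arguments for one
# NIST class from the naming pattern used in nist.py above.
PATH = '/data/lisa/data/nist/by_class/'

def ft_paths(name):
    # Each argument is a one-element list of an .ft file path, mirroring
    # the explicit definitions of nist_digits, nist_lower, etc.
    return {
        'train_data': [PATH + name + '/' + name + '_train_data.ft'],
        'train_lbl':  [PATH + name + '/' + name + '_train_labels.ft'],
        'test_data':  [PATH + name + '/' + name + '_test_data.ft'],
        'test_lbl':   [PATH + name + '/' + name + '_test_labels.ft'],
    }

# Usage sketch (FTDataSet comes from the repo's ftfile module):
#   nist_digits = FTDataSet(**ft_paths('digits'))
```

This reproduces the exact paths in the file; whether `FTDataSet` accepts the arguments this way depends on its signature in `ftfile`, which is not shown here.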