# HG changeset patch
# User Arnaud Bergeron
# Date 1273513977 14400
# Node ID b0622f78cfecfc605a1d3674d5aba91342618bd3
# Parent 227ebc0be7aeddff33a988824b77238a1bad3545
Add a small paragraph mentioning the distribution differences and a figure illustrating the difference.

diff -r 227ebc0be7ae -r b0622f78cfec writeup/images/nistpvalidstats.png
Binary file writeup/images/nistpvalidstats.png has changed
diff -r 227ebc0be7ae -r b0622f78cfec writeup/images/nistteststats.png
Binary file writeup/images/nistteststats.png has changed
diff -r 227ebc0be7ae -r b0622f78cfec writeup/images/nisttrainstats.png
Binary file writeup/images/nisttrainstats.png has changed
diff -r 227ebc0be7ae -r b0622f78cfec writeup/images/nistvalidstats.png
Binary file writeup/images/nistvalidstats.png has changed
diff -r 227ebc0be7ae -r b0622f78cfec writeup/techreport.tex
--- a/writeup/techreport.tex	Mon May 10 13:44:11 2010 -0400
+++ b/writeup/techreport.tex	Mon May 10 13:52:57 2010 -0400
@@ -3,6 +3,7 @@
 \usepackage{graphicx}
 \usepackage{times}
 \usepackage{mlapa}
+\usepackage{subfigure}
 
 \begin{document}
 \title{Generating and Exploiting Perturbed Training Data for Deep Architectures}
@@ -301,6 +302,18 @@
 but no additionnal noise is added to the image, this gives images
 closer to the NIST dataset.
 \end{itemize}
+We noticed that the class distributions of the training sets and the test set differ.
+Since our validation sets are sampled from the training set, they have approximately the same distribution as it, whereas the test set has a markedly different distribution, as illustrated in figure~\ref{setsdata}.
+
+\begin{figure}
+\subfigure[NIST training]{\includegraphics[width=0.5\textwidth]{images/nisttrainstats}}
+\subfigure[NIST validation]{\includegraphics[width=0.5\textwidth]{images/nistvalidstats}}
+\subfigure[NIST test]{\includegraphics[width=0.5\textwidth]{images/nistteststats}}
+\subfigure[NISTP validation]{\includegraphics[width=0.5\textwidth]{images/nistpvalidstats}}
+\caption{Proportion of each class in some of the data sets.}
+\label{setsdata}
+\end{figure}
+
 \subsection{Models and their Hyperparameters}
 
 \subsubsection{Multi-Layer Perceptrons (MLP)}
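
The script that produced the per-class proportion plots (`images/nist*stats.png`) is not part of this changeset. As a minimal sketch of how such statistics could be computed, assuming labels are available as an integer array (the function name `class_proportions` is hypothetical, not from the repository):

```python
import numpy as np

def class_proportions(labels, num_classes):
    """Return the fraction of examples belonging to each class.

    `labels` is a 1-D array of integer class indices in [0, num_classes).
    """
    counts = np.bincount(labels, minlength=num_classes)
    return counts / counts.sum()

# Toy example with 3 classes: class 2 holds half of the examples.
labels = np.array([0, 0, 1, 2, 2, 2])
print(class_proportions(labels, 3))
```

Comparing the resulting proportion vectors for the training, validation, and test splits is one way to quantify the distribution mismatch the added paragraph describes.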