changeset 452:b0622f78cfec

Add a short paragraph mentioning the distribution differences and a figure illustrating them.
author Arnaud Bergeron <abergeron@gmail.com>
date Mon, 10 May 2010 13:52:57 -0400
parents 227ebc0be7ae
children 3140f678dad2
files writeup/images/nistpvalidstats.png writeup/images/nistteststats.png writeup/images/nisttrainstats.png writeup/images/nistvalidstats.png writeup/techreport.tex
diffstat 5 files changed, 13 insertions(+), 0 deletions(-)
Binary file writeup/images/nistpvalidstats.png has changed
Binary file writeup/images/nistteststats.png has changed
Binary file writeup/images/nisttrainstats.png has changed
Binary file writeup/images/nistvalidstats.png has changed
--- a/writeup/techreport.tex	Mon May 10 13:44:11 2010 -0400
+++ b/writeup/techreport.tex	Mon May 10 13:52:57 2010 -0400
@@ -3,6 +3,7 @@
 \usepackage{graphicx}
 \usepackage{times}
 \usepackage{mlapa}
+\usepackage{subfigure}
 
 \begin{document}
 \title{Generating and Exploiting Perturbed Training Data for Deep Architectures}
@@ -301,6 +302,18 @@
 but no additional noise is added to the image; this gives images closer to the NIST dataset.
 \end{itemize}
 
+We noticed that the class distributions of the training and test sets differ.
+Since our validation sets are sampled from the training set, they follow approximately the same distribution, but the test set has a markedly different one, as illustrated in figure \ref{setsdata}.
+
+\begin{figure}
+\subfigure[NIST training]{\includegraphics[width=0.5\textwidth]{images/nisttrainstats}}
+\subfigure[NIST validation]{\includegraphics[width=0.5\textwidth]{images/nistvalidstats}}
+\subfigure[NIST test]{\includegraphics[width=0.5\textwidth]{images/nistteststats}}
+\subfigure[NISTP validation]{\includegraphics[width=0.5\textwidth]{images/nistpvalidstats}}
+\caption{Proportion of each class in the NIST training, validation and test sets and in the NISTP validation set}
+\label{setsdata}
+\end{figure}
+
 \subsection{Models and their Hyperparameters}
 
 \subsubsection{Multi-Layer Perceptrons (MLP)}
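For reference, a minimal sketch of how the per-class proportions plotted in the new figure could be computed (Python/NumPy; the array names, the commented file paths, and the 62-class default are assumptions for illustration, not taken from the repository):

    import numpy as np

    def class_proportions(labels, n_classes=62):
        # Count how many examples fall into each class, then
        # normalize so the proportions sum to 1.
        counts = np.bincount(labels, minlength=n_classes)
        return counts / float(counts.sum())

    # Hypothetical usage: `train_labels` and `test_labels` would be
    # 1-D integer arrays of class labels for the corresponding sets.
    # train_labels = np.load("nist_train_labels.npy")  # hypothetical path
    # test_labels = np.load("nist_test_labels.npy")    # hypothetical path
    # print(class_proportions(train_labels))
    # print(class_proportions(test_labels))

Comparing the two resulting vectors side by side is what reveals the train/test mismatch the paragraph describes.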