comparison writeup/techreport.tex @ 452:b0622f78cfec

Add a small paragraph mentioning the distribution differences and a figure illustrating them.
author Arnaud Bergeron <abergeron@gmail.com>
date Mon, 10 May 2010 13:52:57 -0400
parents 18841eeb433f
children c0f738f0cef0
\documentclass[12pt,letterpaper]{article}
\usepackage[utf8]{inputenc}
\usepackage{graphicx}
\usepackage{times}
\usepackage{mlapa}
\usepackage{subfigure}

\begin{document}
\title{Generating and Exploiting Perturbed Training Data for Deep Architectures}
\author{The IFT6266 Gang}
\date{April 2010, Technical Report, Dept. IRO, U. Montreal}
\item {\bf NISTP} {\em do not use PNIST but NISTP, to remain politically correct...}
NISTP is equivalent to P07 except that we only apply the transformations from slant to pinch. The characters are therefore transformed,
but no additional noise is added to the image, which gives images closer to the NIST dataset (a rough sketch of such a pipeline is given after this list).
\end{itemize}

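To make the preceding description concrete, the following is a minimal sketch of a NISTP-style pipeline that applies only geometric transformations (slant through pinch) and adds no noise. The function names, the image size, and the parameter ranges are placeholders chosen for illustration; they are not the code actually used to generate the data sets.

\begin{verbatim}
# Hypothetical sketch: build NISTP-style examples by applying only the
# geometric transformations (slant ... pinch) and skipping every noise
# step.  Names and parameters below are placeholders, not project code.
import numpy as np

def slant(img, rng):
    # crude slant: shift each row horizontally in proportion to its height
    amount = rng.uniform(-0.3, 0.3)
    h, w = img.shape
    out = np.zeros_like(img)
    for r in range(h):
        shift = int(round(amount * (r - h / 2)))
        out[r] = np.roll(img[r], shift)
    return out

def pinch(img, rng):
    # placeholder for the pinch deformation (identity here)
    return img

# the real pipeline has more geometric steps between slant and pinch
GEOMETRIC_TRANSFORMS = [slant, pinch]

def make_nistp_example(img, seed=0):
    """Apply the geometric transformations only; no noise is added."""
    rng = np.random.RandomState(seed)
    for transform in GEOMETRIC_TRANSFORMS:
        img = transform(img, rng)
    return img
\end{verbatim}
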
We noticed that the class distributions of the training and test sets differ.
Since our validation sets are sampled from the training sets, they have approximately the same distribution as the training data, while the test set follows a markedly different distribution, as illustrated in figure \ref{setsdata}.

\begin{figure}
\subfigure[NIST training]{\includegraphics[width=0.5\textwidth]{images/nisttrainstats}}
\subfigure[NIST validation]{\includegraphics[width=0.5\textwidth]{images/nistvalidstats}}
\subfigure[NIST test]{\includegraphics[width=0.5\textwidth]{images/nistteststats}}
\subfigure[NISTP validation]{\includegraphics[width=0.5\textwidth]{images/nistpvalidstats}}
\caption{Proportion of each class in some of the data sets.}
\label{setsdata}
\end{figure}

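The per-class proportions shown in figure \ref{setsdata} can be computed with a few lines of code. The sketch below is only an illustration: the assumed 62 character classes, the matplotlib plotting, and the NumPy label files are conventions chosen for the example, not our actual scripts.

\begin{verbatim}
# Hypothetical sketch: compute and plot the proportion of each class in
# a labelled set.  The 62-class convention and file layout are assumed.
import numpy as np
import matplotlib.pyplot as plt

def class_proportions(labels, n_classes=62):
    counts = np.bincount(labels, minlength=n_classes)
    return counts / float(counts.sum())

def plot_proportions(labels, title):
    p = class_proportions(labels)
    plt.figure()
    plt.bar(np.arange(len(p)), p)
    plt.title(title)
    plt.xlabel("class index")
    plt.ylabel("proportion")
    plt.savefig(title + "stats.png")

if __name__ == "__main__":
    # e.g. labels stored one set per .npy file (assumed layout)
    for name in ["nisttrain", "nistvalid", "nisttest", "nistpvalid"]:
        labels = np.load(name + "_labels.npy")
        plot_proportions(labels, name)
\end{verbatim}
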
\subsection{Models and their Hyperparameters}

\subsubsection{Multi-Layer Perceptrons (MLP)}

An MLP is a family of functions described by stacking layers of a function similar to