comparison writeup/nips2010_submission.tex @ 496:e41007dd40e9
make the reference shorter.
author    Frederic Bastien <nouiz@nouiz.org>
date      Tue, 01 Jun 2010 11:07:25 -0400
parents   5764a2ae1fb5
children  7ff00c27c976
comparing 495:5764a2ae1fb5 with 496:e41007dd40e9
  83 Self-taught learning~\citep{RainaR2007} is a paradigm that combines principles
  84 of semi-supervised and multi-task learning: the learner can exploit examples
  85 that are unlabeled and/or come from a distribution different from the target
  86 distribution, e.g., from other classes than those of interest. Whereas
  87 it has already been shown that deep learners can clearly take advantage of
- 88 unsupervised learning and unlabeled examples~\citep{Bengio-2009,WestonJ2008}
+ 88 unsupervised learning and unlabeled examples~\citep{Bengio-2009,WestonJ2008-small}
  89 and multi-task learning, not much has been done yet to explore the impact
  90 of {\em out-of-distribution} examples and of the multi-task setting
- 91 (but see~\citep{CollobertR2008-short}). In particular the {\em relative
+ 91 (but see~\citep{CollobertR2008}). In particular the {\em relative
  92 advantage} of deep learning for this setting has not been evaluated.
  93
  94 In this paper we ask the following questions:
  95
  96 %\begin{enumerate}
 170 variability of the transformation: $a$ and $d$ $\sim U[1-3 \times
 171 complexity,1+3 \times complexity]$, $b$ and $e$ $\sim U[-3 \times complexity,3
 172 \times complexity]$ and $c$ and $f$ $\sim U[-4 \times complexity, 4 \times
 173 complexity]$.\\
 174 {\bf Local Elastic Deformations.}
-175 This filter induces a ``wiggly'' effect in the image, following~\citet{SimardSP03},
+175 This filter induces a ``wiggly'' effect in the image, following~\citet{SimardSP03-short},
 176 which provides more details.
 177 Two ``displacement'' fields are generated and applied, for horizontal
 178 and vertical displacements of pixels.
 179 To generate a pixel in either field, first a value between -1 and 1 is
 180 chosen from a uniform distribution. Then all the pixels, in both fields, are
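The affine parameter sampling described at lines 170-173 of the hunk above can be sketched as follows. This is an illustrative numpy-based reading of the text, not the authors' generator code; the function name, the `rng` handling, and the tuple return are assumptions:

```python
import numpy as np

def sample_affine_params(complexity, rng=None):
    """Sample the six affine coefficients (a, b, c, d, e, f) as described
    in the text: a, d ~ U[1-3k, 1+3k], b, e ~ U[-3k, 3k], c, f ~ U[-4k, 4k],
    where k = complexity. Illustrative sketch, not the paper's exact code."""
    rng = np.random.default_rng() if rng is None else rng
    # a, d: scaling terms, centered on 1 (identity transform when complexity is 0)
    a, d = rng.uniform(1 - 3 * complexity, 1 + 3 * complexity, size=2)
    # b, e: shear terms, centered on 0
    b, e = rng.uniform(-3 * complexity, 3 * complexity, size=2)
    # c, f: translation terms, centered on 0, with a wider range
    c, f = rng.uniform(-4 * complexity, 4 * complexity, size=2)
    return a, b, c, d, e, f
```

At `complexity = 0` every interval collapses to a point and the sampled map is the identity transform, which matches the intent of the complexity knob.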
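The elastic-deformation recipe (lines 174-180) can be sketched in the same spirit. The hunk is truncated mid-sentence, so the smoothing and scaling steps below follow the general recipe of Simard et al. rather than this paper's exact parameterization; `alpha` (displacement scale), `sigma` (Gaussian smoothing width), and the use of scipy are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def elastic_deform(image, alpha, sigma, rng=None):
    """Local elastic deformation sketch in the spirit of Simard et al. (2003).
    alpha and sigma are illustrative knobs, not the paper's parameters."""
    rng = np.random.default_rng() if rng is None else rng
    # One random field per axis; each pixel is drawn uniformly from [-1, 1],
    # as in the text, then the field is smoothed and scaled.
    dx = gaussian_filter(rng.uniform(-1, 1, image.shape), sigma) * alpha
    dy = gaussian_filter(rng.uniform(-1, 1, image.shape), sigma) * alpha
    ys, xs = np.meshgrid(np.arange(image.shape[0]),
                         np.arange(image.shape[1]), indexing="ij")
    # Resample the image at the displaced coordinates (bilinear interpolation).
    return map_coordinates(image, [ys + dy, xs + dx], order=1, mode="reflect")
```

With `alpha = 0` the displacement fields vanish and the image is returned unchanged, which is a convenient sanity check when tuning the two knobs.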
 607 %\end{itemize}
 608
 609 A Flash demo of the recognizer (where both the MLP and the SDA can be compared)
 610 can be executed on-line at {\tt http://deep.host22.com}.
 611
-612
-613 {\small
-614 \bibliography{strings,ml,aigaion,specials}
+612 %\newpage
+613 {
+614 \bibliography{strings,strings-short,strings-shorter,ift6266_ml,aigaion-shorter,specials}
 615 %\bibliographystyle{plainnat}
 616 \bibliographystyle{unsrtnat}
 617 %\bibliographystyle{apalike}
 618 }
619 | 619 |