diff writeup/nips2010_submission.tex @ 496:e41007dd40e9

Make the references shorter.
author Frederic Bastien <nouiz@nouiz.org>
date Tue, 01 Jun 2010 11:07:25 -0400
parents 5764a2ae1fb5
children 7ff00c27c976
line wrap: on
line diff
--- a/writeup/nips2010_submission.tex	Tue Jun 01 11:02:10 2010 -0400
+++ b/writeup/nips2010_submission.tex	Tue Jun 01 11:07:25 2010 -0400
@@ -85,10 +85,10 @@
 that are unlabeled and/or come from a distribution different from the target
 distribution, e.g., from other classes that those of interest. Whereas
 it has already been shown that deep learners can clearly take advantage of
-unsupervised learning and unlabeled examples~\citep{Bengio-2009,WestonJ2008}
+unsupervised learning and unlabeled examples~\citep{Bengio-2009,WestonJ2008-small}
 and multi-task learning, not much has been done yet to explore the impact
 of {\em out-of-distribution} examples and of the multi-task setting
-(but see~\citep{CollobertR2008-short}). In particular the {\em relative
+(but see~\citep{CollobertR2008}). In particular the {\em relative
 advantage} of deep learning for this setting has not been evaluated.
 
 In this paper we ask the following questions:
@@ -172,7 +172,7 @@
 \times complexity]$ and $c$ and $f$ $\sim U[-4 \times complexity, 4 \times
 complexity]$.\\
 {\bf Local Elastic Deformations.}
-This filter induces a ``wiggly'' effect in the image, following~\citet{SimardSP03},
+This filter induces a ``wiggly'' effect in the image, following~\citet{SimardSP03-short},
 which provides more details. 
 Two ``displacement'' fields are generated and applied, for horizontal
 and vertical displacements of pixels. 
@@ -609,9 +609,9 @@
 A Flash demo of the recognizer (where both the MLP and the SDA can be compared) 
 can be executed on-line at {\tt http://deep.host22.com}.
 
-
-{\small
-\bibliography{strings,ml,aigaion,specials}
+%\newpage
+{
+\bibliography{strings,strings-short,strings-shorter,ift6266_ml,aigaion-shorter,specials}
 %\bibliographystyle{plainnat}
 \bibliographystyle{unsrtnat}
 %\bibliographystyle{apalike}
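
The local elastic deformation described in one of the hunks above follows the Simard et al. technique: two random displacement fields (one for horizontal, one for vertical pixel shifts) are smoothed and applied to the image. A minimal NumPy sketch of that idea is below; the parameter names `alpha` and `sigma` and the box-blur smoothing are illustrative assumptions, not the paper's actual implementation (which uses Gaussian-filtered fields):

```python
import numpy as np

def elastic_deform(image, alpha=2.0, sigma=1, seed=0):
    """Simard-style elastic deformation (illustrative sketch, not the
    paper's code). Two random displacement fields are smoothed, scaled
    by `alpha`, and used to resample the image."""
    rng = np.random.default_rng(seed)
    h, w = image.shape

    def smooth(field):
        # Crude box-blur over a (2*sigma+1)^2 neighbourhood; a stand-in
        # for the Gaussian smoothing used by Simard et al.
        out = np.zeros_like(field)
        n = 0
        for di in range(-sigma, sigma + 1):
            for dj in range(-sigma, sigma + 1):
                out += np.roll(np.roll(field, di, 0), dj, 1)
                n += 1
        return out / n

    # Horizontal (dx) and vertical (dy) displacement fields in pixels.
    dx = smooth(rng.uniform(-1.0, 1.0, (h, w))) * alpha
    dy = smooth(rng.uniform(-1.0, 1.0, (h, w))) * alpha

    # Resample with nearest-neighbour lookup, clipping at the borders.
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    yi = np.clip(np.rint(ys + dy), 0, h - 1).astype(int)
    xi = np.clip(np.rint(xs + dx), 0, w - 1).astype(int)
    return image[yi, xi]
```

Nearest-neighbour resampling keeps the sketch dependency-free; bilinear interpolation (e.g. `scipy.ndimage.map_coordinates`) would give smoother results.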