pylearn: changeset 1029:0ddb5f637ce3 (merge)

author     Olivier Delalleau <delallea@iro>
date       Mon, 06 Sep 2010 20:41:58 -0400
parents    c6a74b24330b (current diff), a1b6ccd5b6dc (diff)
children   a154c9b68239
files      doc/v2_planning/committees.txt, doc/v2_planning/optimization.txt, doc/v2_planning/sampler.txt
diffstat   3 files changed, 20 insertions(+), 5 deletions(-)
--- a/doc/v2_planning/committees.txt	Mon Sep 06 20:41:51 2010 -0400
+++ b/doc/v2_planning/committees.txt	Mon Sep 06 20:41:58 2010 -0400
@@ -2,11 +2,11 @@
 * Existing Python ML libraries investigation: GD, DWF, IG, DE
 * Dataset interface: DE, OB, OD, AB, PV
-* Learners: AB, PL, GM, IG, RP
+* Learners: AB, PL, GM, IG, RP, NB
 * Optimization: JB, PL, OD
 * Inference/sampling: JB, GD, AC
 * Job management, analysis, metrics, costs, visualization: GD, FS, PL, XM
-* Formulas/tags: FB, NB, RP, AC, OB
+* Formulas/tags: FB, RP, AC, OB
 * Coding style: DE, OD, DWF, FB
 
 Issues to be tackled in the future:
--- a/doc/v2_planning/optimization.txt	Mon Sep 06 20:41:51 2010 -0400
+++ b/doc/v2_planning/optimization.txt	Mon Sep 06 20:41:58 2010 -0400
@@ -30,8 +30,12 @@
   - sgd with annealing schedule
   - TONGA
   - James Marten's Hessian-free
+  - Conjugate gradients, batch and (large) mini-batch [that is also what Marten's thing does]
 
 Do we need anything to make batch algos work better with Pylearn things?
-  - conjugate methods?
-  - L-BFGS?
+  - conjugate methods? yes
+  - L-BFGS? maybe, when needed
+
+
+
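The optimization diff above answers "conjugate methods? yes" and "L-BFGS? maybe, when needed". As a point of reference for what driving such batch optimizers from Python looks like, here is a minimal sketch assuming the model's cost and gradient are exposed as functions of a flat parameter vector. The cost_and_grad helper and the least-squares toy objective are hypothetical illustrations, not part of Pylearn; scipy's fmin_cg and fmin_l_bfgs_b stand in for whatever conjugate-gradient and L-BFGS implementations get adopted.

    import numpy as np
    from scipy.optimize import fmin_cg, fmin_l_bfgs_b

    def cost_and_grad(theta, X, y):
        # Hypothetical full-batch objective (linear least squares), standing in
        # for the cost/gradient a learner would expose on a flat parameter vector.
        resid = X.dot(theta) - y
        cost = 0.5 * np.dot(resid, resid)
        grad = X.T.dot(resid)
        return cost, grad

    rng = np.random.RandomState(0)
    X = rng.randn(100, 5)
    y = rng.randn(100)
    theta0 = np.zeros(5)

    # Nonlinear conjugate gradient: takes separate cost and gradient callables.
    theta_cg = fmin_cg(lambda t: cost_and_grad(t, X, y)[0], theta0,
                       fprime=lambda t: cost_and_grad(t, X, y)[1], disp=False)

    # L-BFGS: takes a single callable returning (cost, gradient).
    theta_lbfgs, final_cost, info = fmin_l_bfgs_b(cost_and_grad, theta0, args=(X, y))

Both optimizers consume the whole batch (or a large mini-batch) per function evaluation, which is the interface question raised above for making batch algorithms work with Pylearn.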
--- a/doc/v2_planning/sampler.txt	Mon Sep 06 20:41:51 2010 -0400
+++ b/doc/v2_planning/sampler.txt	Mon Sep 06 20:41:58 2010 -0400
@@ -1,3 +1,6 @@
+
+Inference / Sampling committee: JB, GD, AC
+
 OVERVIEW
 ========
 
@@ -36,4 +39,12 @@
 * Annealing
 * Parallel Tempering, Tempered Transitions, Simulated Tempering
 * Nested Sampling (?)
-* Hamiltonian Monte Carlo
+* Hamiltonian Monte Carlo --> or is it Hybrid Monte Carlo?
+
+3. USAGE PATTERNS
+=================
+
+* MCMC methods have a usage pattern that is quite different from the kind of univariate sampling methods
+needed for nice-and-easy parametric families.
+
+
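On the "Hamiltonian Monte Carlo --> or is it Hybrid Monte Carlo?" question in the diff above: the two names refer to the same algorithm (leapfrog simulation of Hamiltonian dynamics followed by a Metropolis accept/reject step). Below is a minimal numpy sketch of one such transition, mainly to illustrate the stateful, chain-based usage pattern the new USAGE PATTERNS section contrasts with one-shot draws from parametric families. The names hmc_step, logp and grad_logp are hypothetical and do not correspond to any existing Pylearn interface.

    import numpy as np

    def hmc_step(x, logp, grad_logp, step_size=0.1, n_leapfrog=20, rng=np.random):
        # One Hybrid/Hamiltonian Monte Carlo transition for a 1-D state vector x,
        # given the target's log-density (up to a constant) and its gradient.
        p = rng.randn(*x.shape)                      # resample momentum
        x_new, p_new = x.copy(), p.copy()

        # Leapfrog integration: half momentum step, alternating full steps, half step.
        p_new += 0.5 * step_size * grad_logp(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new
            p_new += step_size * grad_logp(x_new)
        x_new += step_size * p_new
        p_new += 0.5 * step_size * grad_logp(x_new)

        # Metropolis accept/reject on the joint (position, momentum) energy.
        current_h = -logp(x) + 0.5 * np.dot(p, p)
        proposed_h = -logp(x_new) + 0.5 * np.dot(p_new, p_new)
        if np.log(rng.uniform()) < current_h - proposed_h:
            return x_new                             # accept the proposal
        return x                                     # reject: stay at the current state

    # Usage: a chain is built by iterating the transition and carrying state from
    # one call to the next, unlike a one-shot sampler for a nice parametric family.
    logp = lambda x: -0.5 * np.dot(x, x)             # standard Gaussian target
    grad_logp = lambda x: -x
    samples, x = [], np.zeros(3)
    for _ in range(1000):
        x = hmc_step(x, logp, grad_logp)
        samples.append(x)

The loop at the end is the usage-pattern point: an MCMC sampler is an iterative process whose output depends on its previous state, so its interface has to expose a transition (or an iterator over a chain) rather than a single draw() call.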