changeset 1366:f945ed016c68
comment by YB
author | Yoshua Bengio <bengioy@iro.umontreal.ca>
date | Fri, 12 Nov 2010 13:49:13 -0500
parents | 049b99f4b323
children | 9474fb4ad109
files | doc/v2_planning/datalearn.txt
diffstat | 1 files changed, 8 insertions(+), 0 deletions(-)
--- a/doc/v2_planning/datalearn.txt  Fri Nov 12 11:49:00 2010 -0500
+++ b/doc/v2_planning/datalearn.txt  Fri Nov 12 13:49:13 2010 -0500
@@ -303,6 +303,12 @@
 hyper-parameters for which you need to recompile the thenao function and can
 not be just parameters ( so we would have yet another category ?).
 
+Yoshua's comments on RP's comments: I don't understand why we would
+need to create these types. Isn't it just a matter for the programmer
+to decide what are the inputs of the compiled function, and which
+are possibly constant (e.g. holding some hyper-parameters constant
+for a while)?
+
 James: Another syntactic option for iterating over datasets is
 
 .. code-block:: python
@@ -315,6 +321,8 @@
 numeric_iterator function can also specify what compile mode to use, any
 givens you might want to apply, etc.
 
+Yoshua's comment to James' comment: I like that approach.
+
 OD comments: Would there also be some kind of function cache to avoid
 compiling the same function again if we re-iterate on the same dataset with
 the same arguments? Maybe a more generic issue is: would there be a way for
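
To make Yoshua's point in the first hunk concrete, here is a minimal Theano
sketch (not part of the changeset; the variables and values are illustrative)
of one graph compiled two ways: once with a hyper-parameter as an ordinary
input of the compiled function, and once with it held constant via ``givens``,
without introducing any new category of variable.

.. code-block:: python

    import numpy
    import theano
    import theano.tensor as T

    x = T.matrix('x')                       # a minibatch of examples
    lr = T.scalar('lr')                     # a hyper-parameter
    w = theano.shared(numpy.zeros((5, 3)), name='w')

    cost = T.sum(T.dot(x, w) ** 2)
    g = T.grad(cost, w)
    updates = [(w, w - lr * g)]

    # The learning rate is an ordinary input: it can change at every call.
    train_a = theano.function([x, lr], cost, updates=updates)

    # The same graph, but the learning rate is held constant for a while by
    # substituting a shared value at compile time.
    lr_value = theano.shared(numpy.asarray(0.1, dtype=lr.dtype))
    train_b = theano.function([x], cost, updates=updates,
                              givens={lr: lr_value})

Because ``lr_value`` is a shared variable, it can later be changed with
``lr_value.set_value(...)`` without recompiling, which is one way to read
"holding some hyper-parameters constant for a while".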
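
The code block that James' option refers to lies outside the hunks shown
above, so the following is only a guess at the general shape of such an
iterator (the class and argument names are assumptions, not the syntax
proposed in datalearn.txt): a dataset method that compiles the symbolic
expression with whatever compile mode and givens the caller asks for, then
yields numeric minibatches.

.. code-block:: python

    import theano

    class DatasetExpression(object):
        """Illustrative only: wraps a symbolic output together with the
        numeric minibatches needed to evaluate it."""

        def __init__(self, inputs, output, numeric_batches):
            self.inputs = inputs                    # list of Theano variables
            self.output = output                    # Theano expression
            self.numeric_batches = numeric_batches  # iterable of tuples of arrays

        def numeric_iterator(self, mode=None, givens=None):
            # Compile once per iterator, honouring the caller's compile mode
            # and givens, then stream numeric results batch by batch.
            f = theano.function(self.inputs, self.output,
                                mode=mode, givens=givens)
            for batch in self.numeric_batches:
                yield f(*batch)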
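
On OD's question about a function cache, a minimal sketch of the idea (the
helper name and cache key are assumptions; a real key would also have to
account for givens, updates, and so on):

.. code-block:: python

    import theano

    _function_cache = {}

    def compile_cached(inputs, output, mode=None):
        # Theano variables are hashable, so (inputs, output, mode) can serve
        # as a crude cache key; re-iterating on the same dataset with the
        # same arguments then reuses the already-compiled function instead
        # of compiling it again.
        key = (tuple(inputs), output, mode)
        if key not in _function_cache:
            _function_cache[key] = theano.function(inputs, output, mode=mode)
        return _function_cache[key]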