# HG changeset patch
# User Joseph Turian
# Date 1243373983 14400
# Node ID e915f5c9bb21775d5110f059da727075f3b9e310
# Parent  df3aef87d8d2adbbe1dc28e66c7ba92f0d8e9012
Added more descriptive comments to crossentropy and KL divergence.

diff -r df3aef87d8d2 -r e915f5c9bb21 pylearn/algorithms/cost.py
--- a/pylearn/algorithms/cost.py	Mon May 25 23:13:56 2009 -0400
+++ b/pylearn/algorithms/cost.py	Tue May 26 17:39:43 2009 -0400
@@ -17,6 +17,8 @@
 def cross_entropy(target, output, mean_axis=0, sum_axis=1):
     """
+    This is the cross-entropy over a binomial event, in which each dimension
+    is an independent binomial trial.
     @todo: This is essentially duplicated as nnet_ops.binary_crossentropy
     @warning: OUTPUT and TARGET are reversed in nnet_ops.binary_crossentropy
     """
 
@@ -25,6 +27,8 @@
 def KL_divergence(target, output):
     """
+    This is a KL divergence over a binomial event, in which each dimension
+    is an independent binomial trial.
     @note: We do not compute the mean, because if target and output have
     different shapes then the result will be garbled.
     """
 
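The docstrings added above describe the binomial cross-entropy and its KL divergence. A minimal NumPy sketch of those two formulas, for reference, might look like the following. This is an illustration of the math only, not the actual pylearn implementation: the function names and axis defaults mirror the patch, but the bodies, the use of NumPy, and the choice to leave `KL_divergence` unreduced (per the patch's @note about not computing a mean) are assumptions.

```python
import numpy as np

def cross_entropy(target, output, mean_axis=0, sum_axis=1):
    # Per-dimension binomial cross-entropy:
    #   -[t*log(o) + (1-t)*log(1-o)]
    # summed over sum_axis (the feature dimensions), then averaged
    # over mean_axis (e.g. over a minibatch). Assumes output in (0, 1).
    ce = -(target * np.log(output) + (1.0 - target) * np.log(1.0 - output))
    return ce.sum(axis=sum_axis).mean(axis=mean_axis)

def KL_divergence(target, output):
    # KL(target || output) for independent binomial trials per dimension:
    #   t*log(t/o) + (1-t)*log((1-t)/(1-o))
    # Returned elementwise with no mean taken (cf. the patch's @note);
    # the lack of any reduction here is an assumption. Assumes target
    # and output lie strictly in (0, 1) so the logs are finite.
    return (target * np.log(target / output)
            + (1.0 - target) * np.log((1.0 - target) / (1.0 - output)))
```

When `target == output` the KL divergence is exactly zero, while the cross-entropy equals the entropy of the target distribution; that relationship (KL = cross-entropy minus target entropy) is why the two functions sit side by side in this file.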