Mercurial > pylearn
view cost.py @ 450:117e5b09cf31
Added an XlogX op.
| author | Joseph Turian <turian@gmail.com> |
|---|---|
| date | Thu, 04 Sep 2008 14:46:17 -0400 |
| parents | 2bb67e978c28 |
| children | d99fefbc9324 |
""" Cost functions. @note: All of these functions return one cost per example. So it is your job to perform a tensor.sum over the individual example losses. """ import theano.tensor as T def quadratic(target, output, axis=1): return T.mean(T.sqr(target - output), axis) def cross_entropy(target, output, axis=1): """ @todo: This is essentially duplicated as nnet_ops.binary_crossentropy @warning: OUTPUT and TARGET are reversed in nnet_ops.binary_crossentropy """ return -T.mean(target * T.log(output) + (1 - target) * T.log(1 - output), axis=axis)