annotate cost.py @ 460:fda72e944104

\/ -> /
author Joseph Turian <turian@iro.umontreal.ca>
date Tue, 07 Oct 2008 23:07:23 -0400
parents d99fefbc9324
children 3daabc7f94ff
"""
Cost functions.

@note: All of these functions return one cost per example. So it is your
job to perform a tensor.sum over the individual example losses.
"""

import theano.tensor as T
from xlogx import xlogx

def quadratic(target, output, axis=1):
    return T.mean(T.sqr(target - output), axis)

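A pure-Python/NumPy sketch (NumPy standing in for theano.tensor; the names `quadratic_np`, `per_example`, and `total` are illustrative, not part of this module) of the @note above: each cost function returns one value per example, and the caller performs the sum over examples itself.

```python
import numpy as np

def quadratic_np(target, output, axis=1):
    # Mirrors quadratic() above: mean squared error, reduced per example.
    return np.mean(np.square(target - output), axis)

target = np.array([[1.0, 0.0], [0.0, 1.0]])
output = np.array([[0.9, 0.1], [0.2, 0.8]])

per_example = quadratic_np(target, output)  # one cost per example (row)
total = per_example.sum()                   # the caller sums, per the @note
```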
def cross_entropy(target, output, axis=1):
    """
    @todo: This is essentially duplicated as nnet_ops.binary_crossentropy
    @warning: OUTPUT and TARGET are reversed in nnet_ops.binary_crossentropy
    """
    return -T.mean(target * T.log(output) + (1 - target) * T.log(1 - output), axis=axis)

def KL_divergence(target, output):
    """
    @note: We do not compute the mean, because if target and output have
    different shapes then the result will be garbled.
    """
    return -(target * T.log(output) + (1 - target) * T.log(1 - output)) \
        + (xlogx(target) + xlogx(1 - target))
    # return cross_entropy(target, output, axis) - cross_entropy(target, target, axis)
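A hypothetical scalar check, in pure Python rather than Theano, that the KL formula above agrees with the commented-out identity KL(target, output) == cross_entropy(target, output) - cross_entropy(target, target), elementwise. The helpers `xent`, `xlogx_scalar`, and `kl` are stand-ins for illustration only.

```python
from math import log

def xent(t, o):
    # Per-element binary cross-entropy, as in cross_entropy() without the mean.
    return -(t * log(o) + (1 - t) * log(1 - o))

def xlogx_scalar(x):
    # Scalar analogue of xlogx: x * log(x), with 0 * log(0) taken as 0.
    return x * log(x) if x > 0 else 0.0

def kl(t, o):
    # Same expression as KL_divergence() above, on scalars.
    return xent(t, o) + (xlogx_scalar(t) + xlogx_scalar(1 - t))

t, o = 0.3, 0.7
assert abs(kl(t, o) - (xent(t, o) - xent(t, t))) < 1e-12
```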