annotate cost.py @ 497:a272f4cbf004

'x' => 'input' 'y' => 'output'
author Joseph Turian <turian@gmail.com>
date Tue, 28 Oct 2008 12:25:04 -0400
parents f13847478c6d
children
"""
Cost functions.

@note: All of these functions return one cost per example. So it is your
job to perform a tensor.sum over the individual example losses.

@todo: Make a Cost class, with a particular contract.

@todo: It would be nice to implement a hinge loss, with a particular margin.
"""

import theano.tensor as T
from xlogx import xlogx

def quadratic(target, output, axis=1):
    return T.mean(T.sqr(target - output), axis=axis)

def cross_entropy(target, output, axis=1):
    """
    @todo: This is essentially duplicated as nnet_ops.binary_crossentropy
    @warning: OUTPUT and TARGET are reversed in nnet_ops.binary_crossentropy
    """
    return -T.mean(target * T.log(output) + (1 - target) * T.log(1 - output), axis=axis)

def KL_divergence(target, output):
    """
    @note: We do not compute the mean, because if target and output have
    different shapes then the result will be garbled.
    """
    return -(target * T.log(output) + (1 - target) * T.log(1 - output)) \
            + (xlogx(target) + xlogx(1 - target))
    # return cross_entropy(target, output, axis) - cross_entropy(target, target, axis)
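
A minimal usage sketch, assuming Theano is installed and this module is importable as cost; the variable names and shapes here are illustrative, not part of the file. It shows how a scalar training cost might be built by summing the per-example losses, as the module docstring recommends.

import theano
import theano.tensor as T
from cost import cross_entropy  # assumes cost.py is on the Python path

# Symbolic inputs: one example per row.
target = T.dmatrix('target')   # desired outputs
output = T.dmatrix('output')   # model outputs, expected to lie in (0, 1)

# cross_entropy returns one cost per example (a vector, mean over axis=1).
per_example_loss = cross_entropy(target, output)

# Sum the individual example losses into a scalar cost, per the docstring note.
total_cost = T.sum(per_example_loss)

f = theano.function([target, output], total_cost)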