annotate xlogx.py @ 501:4fb6f7320518

N-class logistic regression top-layer works
author Joseph Turian <turian@gmail.com>
date Tue, 28 Oct 2008 13:54:01 -0400
parents 117e5b09cf31
children
All lines below are from changeset 450:117e5b09cf31 (Added an XlogX op., Joseph Turian <turian@gmail.com>).

import theano
from theano import tensor, scalar
import numpy

class XlogX(scalar.UnaryScalarOp):
    """
    Compute X * log(X), with special case 0 log(0) = 0.
    """
    @staticmethod
    def st_impl(x):
        # Pure-Python implementation, used when no C implementation is compiled.
        if x == 0.0:
            return 0.0
        return x * numpy.log(x)

    def impl(self, x):
        return XlogX.st_impl(x)

    def grad(self, (x,), (gz,)):
        # Python 2 tuple-unpacking signature, as in Theano scalar ops of this era.
        return [gz * (1 + scalar.log(x))]
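    # Derivation note (annotation added here, not part of the changeset): by
    # the product rule, d/dx [x * log(x)] = log(x) + x * (1/x) = log(x) + 1,
    # which grad returns scaled by the incoming gradient gz. The derivative is
    # unbounded as x -> 0 even though the op defines 0 log(0) = 0.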
    def c_code(self, node, name, (x,), (z,), sub):
        # Emit C only for floating-point inputs; the generated expression
        # mirrors st_impl, including the 0 log(0) = 0 special case.
        if node.inputs[0].type in [scalar.float32, scalar.float64]:
            return """%(z)s =
                %(x)s == 0.0
                ? 0.0
                : %(x)s * log(%(x)s);""" % locals()
        raise NotImplementedError('only floating point is implemented')
# Scalar op instance; upgrade_to_float promotes integer inputs to float types.
scalar_xlogx = XlogX(scalar.upgrade_to_float, name='scalar_xlogx')
# Elementwise tensor op built from the scalar op, applied across whole arrays.
xlogx = tensor.Elemwise(scalar_xlogx, name='xlogx')
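Usage sketch (added annotation, not part of the changeset; the import path is an assumption): applying the tensor-level op elementwise, with zero entries mapping to 0.0 rather than nan.

    import numpy
    import theano
    import theano.tensor as T
    from xlogx import xlogx   # hypothetical: assumes the module above is importable

    x = T.dmatrix('x')
    f = theano.function([x], xlogx(x))
    print(f(numpy.asarray([[0.0, 0.5], [1.0, 2.0]])))
    # zero entries come back as 0.0; the rest are x * log(x)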
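And a quick sanity check of grad against the closed form log(x) + 1 (same assumptions as above; tensor.grad differentiates a scalar cost through the Elemwise op):

    xv = numpy.asarray([0.5, 1.0, 2.0])
    xg = T.dvector('x')
    cost = T.sum(xlogx(xg))
    g = theano.function([xg], T.grad(cost, xg))
    assert numpy.allclose(g(xv), numpy.log(xv) + 1)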