Mercurial > pylearn
annotate pylearn/formulas/activations.py @ 1315:f21693eecec7
added several activation formulas
author: Olivier Breuleux <breuleux@gmail.com>
date: Thu, 07 Oct 2010 16:42:12 -0400
parents: 970082c8e9de
children: 00116be92710
ancestor changeset: rev 1308, d5e536338b69, "5 activation functions added to formulas" (Razvan Pascanu <r.pascanu@gmail.com>)

"""
Activation functions for artificial neural units.
"""

__authors__ = "Razvan Pascanu, .."
__copyright__ = "(c) 2010, Universite de Montreal"
__license__ = "3-clause BSD License"
__contact__ = "Razvan Pascanu <r.pascanu@gmail.com>"

import theano
from theano import tensor

import tags


@tags.tags('activation', 'unary',
           'sigmoid', 'logistic',
           'non-negative', 'increasing')
def sigmoid(x):
    """
    Return a symbolic variable representing the sigmoid (logistic)
    function of the input x.

    .. math::
        \\textrm{sigmoid}(x) = \\frac{1}{1 + e^{-x}}

    The image of :math:`\\textrm{sigmoid}(x)` is the open interval (0,
    1), *in theory*. *In practice*, due to rounding errors in floating
    point representations, :math:`\\textrm{sigmoid}(x)` will lie in the
    closed range [0, 1].

    :param x: tensor-like (a Theano variable with type theano.Tensor,
              or a value that can be converted to one) :math:`\\in
              \\mathbb{R}^n`

    :return: a Theano variable with the same shape as the input, where
             the sigmoid function is mapped to each element of the
             input x.
    """
    return theano.tensor.nnet.sigmoid(x)
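As a quick numeric check of the sigmoid formula above, here is a minimal NumPy sketch (NumPy stands in for Theano purely for illustration; the function in this module returns a symbolic Theano variable, not a number):

```python
import numpy as np

def sigmoid_np(x):
    # logistic function: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid_np(0.0))                      # 0.5, the midpoint of (0, 1)
print(sigmoid_np(np.array([-40.0, 40.0])))  # saturates toward the closed range [0, 1]
```
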


@tags.tags('activation', 'unary',
           'tanh', 'hyperbolic tangent',
           'odd', 'increasing')
def tanh(x):
    """
    Return a symbolic variable representing the tanh (hyperbolic
    tangent) of the input x.

    .. math::
        \\textrm{tanh}(x) = \\frac{e^{2x} - 1}{e^{2x} + 1}

    The image of :math:`\\textrm{tanh}(x)` is the open interval (-1,
    1), *in theory*. *In practice*, due to rounding errors in floating
    point representations, :math:`\\textrm{tanh}(x)` will lie in the
    closed range [-1, 1].

    :param x: tensor-like (a Theano variable with type theano.Tensor,
              or a value that can be converted to one) :math:`\\in
              \\mathbb{R}^n`

    :return: a Theano variable with the same shape as the input, where
             the tanh function is mapped to each element of the input
             x.
    """
    return theano.tensor.tanh(x)


@tags.tags('activation', 'unary',
           'tanh', 'hyperbolic tangent', 'normalized',
           'odd', 'increasing')
def tanh_normalized(x):
    """
    Return a symbolic variable representing a normalized tanh
    (hyperbolic tangent) of the input x.
    TODO: where does 1.759 come from? why is it normalized like that?

    .. math::
        \\textrm{tanh\\_normalized}(x) = 1.759\\textrm{ tanh}\\left(\\frac{2x}{3}\\right)

    The image of :math:`\\textrm{tanh\\_normalized}(x)` is the open
    interval (-1.759, 1.759), *in theory*. *In practice*, due to
    rounding errors in floating point representations,
    :math:`\\textrm{tanh\\_normalized}(x)` will lie in the approximate
    closed range [-1.759, 1.759]. The exact bound depends on the
    precision of the floating point representation.

    :param x: tensor-like (a Theano variable with type theano.Tensor,
              or a value that can be converted to one) :math:`\\in
              \\mathbb{R}^n`

    :return: a Theano variable with the same shape as the input, where
             the tanh_normalized function is mapped to each element of
             the input x.
    """
    # 0.6666 approximates the 2/3 factor in the docstring formula
    return 1.759*theano.tensor.tanh(0.6666*x)
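On the TODO above: the expression has the same shape as the scaled hyperbolic tangent f(x) = 1.7159 tanh(2x/3) recommended by LeCun et al. in "Efficient BackProp" (chosen so that f(1) is close to 1); whether 1.759 here is intentional or a transposition of 1.7159 is not settled by this source. A NumPy sketch of the formula exactly as implemented in this module:

```python
import numpy as np

def tanh_normalized_np(x):
    # same constants as the module code: 1.759 * tanh(0.6666 * x)
    return 1.759 * np.tanh(0.6666 * x)

print(tanh_normalized_np(0.0))    # 0.0, since tanh is odd
print(tanh_normalized_np(100.0))  # approaches the upper bound 1.759
```
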


@tags.tags('activation', 'unary',
           'abs_tanh', 'abs', 'tanh', 'hyperbolic tangent',
           'non-negative', 'even')
def abs_tanh(x):
    """
    Return a symbolic variable representing the absolute value of the
    hyperbolic tangent of x.

    .. math::
        \\textrm{abs\\_tanh}(x) = |\\textrm{tanh}(x)|

    The image of :math:`\\textrm{abs\\_tanh}(x)` is the interval [0, 1),
    *in theory*. *In practice*, due to rounding errors in floating
    point representations, :math:`\\textrm{abs\\_tanh}(x)` will lie in
    the closed range [0, 1].

    :param x: tensor-like (a Theano variable with type theano.Tensor,
              or a value that can be converted to one) :math:`\\in
              \\mathbb{R}^n`

    :return: a Theano variable with the same shape as the input, where
             the abs_tanh function is mapped to each element of the
             input x.
    """
    return theano.tensor.abs_(theano.tensor.tanh(x))


@tags.tags('activation', 'unary',
           'abs_tanh', 'abs', 'tanh', 'hyperbolic tangent', 'normalized',
           'non-negative', 'even')
def abs_tanh_normalized(x):
    """
    Return a symbolic variable representing the absolute value of a
    normalized tanh (hyperbolic tangent) of the input x.
    TODO: where does 1.759 come from? why is it normalized like that?

    .. math::
        \\textrm{abs\\_tanh\\_normalized}(x) = \\left|1.759\\textrm{ tanh}\\left(\\frac{2x}{3}\\right)\\right|

    The image of :math:`\\textrm{abs\\_tanh\\_normalized}(x)` is the
    range [0, 1.759), *in theory*. *In practice*, due to rounding
    errors in floating point representations,
    :math:`\\textrm{abs\\_tanh\\_normalized}(x)` will lie in the
    approximate closed range [0, 1.759]. The exact upper bound
    depends on the precision of the floating point representation.

    :param x: tensor-like (a Theano variable with type theano.Tensor,
              or a value that can be converted to one) :math:`\\in
              \\mathbb{R}^n`

    :return: a Theano variable with the same shape as the input, where
             the abs_tanh_normalized function is mapped to each
             element of the input x.
    """
    return theano.tensor.abs_(1.759*theano.tensor.tanh(0.6666*x))


@tags.tags('activation','softsign')
def softsign_act(input):
    """
    Returns a symbolic variable that computes the softsign of ``input``.

    .. math::
        f(input) = \\frac{input}{1.0 + |input|}

    :type input: tensor-like
    :param input: input tensor to which softsign should be applied
    :rtype: Theano variable
    :return: tensor obtained after applying the softsign function

    """
    return input/(1.0 + tensor.abs_(input))
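The softsign maps the reals into the open interval (-1, 1), approaching the bounds only as |input| grows; a minimal NumPy sketch of the same formula (NumPy replaces the symbolic Theano ops for illustration):

```python
import numpy as np

def softsign_np(x):
    # x / (1 + |x|): bounded in (-1, 1), with much heavier tails than tanh
    return x / (1.0 + np.abs(x))

print(softsign_np(np.array([-1.0, 0.0, 1.0])))  # [-0.5  0.   0.5]
```
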


@tags.tags('activation','softsign','abs')
def abssoftsign_act(input):
    """
    Returns a symbolic variable that computes the absolute value of the
    softsign function on the input tensor ``input``.

    .. math::
        f(input) = \\left| \\frac{input}{1.0 + |input|} \\right|

    :type input: tensor-like
    :param input: input tensor to which softsign should be applied
    :rtype: Tensor variable
    :return: tensor obtained by taking the absolute value of softsign
             of the input
    """
    return tensor.abs_(input)/(1.0 + tensor.abs_(input))


@tags.tags('activation','rectifier')
def rectifier_act(input):
    """
    Returns a symbolic variable that equals ``input`` wherever the
    input is positive, and 0 elsewhere.

    .. math::
        f(input) = \\left\\lbrace \\begin{array}{ll}
                   input & \\text{if } input > 0 \\\\
                   0 & \\text{otherwise}
                   \\end{array} \\right.

    :type input: tensor-like
    :param input: input tensor to which the rectifier activation function
                  will be applied
    :rtype: Tensor variable
    :return: non-negative tensor equal to the input where the input is
             positive, and 0 otherwise
    """
    return input*(input>=0)
d5e536338b69
5 activation functions added to formulas
Razvan Pascanu <r.pascanu@gmail.com>
parents:
diff
changeset
|
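For illustration, the same elementwise formula can be checked numerically. This is a minimal NumPy sketch; the helper name ``rectifier_np`` is made up for this example and is not part of the module:

```python
import numpy as np

def rectifier_np(x):
    # Same elementwise formula as the symbolic version: x * (x >= 0).
    # The boolean mask is promoted to 0/1, zeroing out negative entries.
    return x * (x >= 0)

print(rectifier_np(np.array([-3.0, 0.0, 2.5])))
```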

@tags.tags('activation', 'softplus')
def softplus_act(input):
    """
    Returns a symbolic variable that computes the softplus of ``input``.

    .. note:: (TODO) rescale in order to have a steady-state regime close
        to 0 at initialization.

    .. math::
        f(input) = \\ln \\left( 1 + e^{input} \\right)

    :type input: tensor-like
    :param input: input tensor to which the softplus should be applied
    :rtype: Theano variable
    :return: tensor obtained by applying softplus to the input
    """
    return tensor.nnet.softplus(input)

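A numerical sketch of the same formula, for illustration only (the helper name ``softplus_np`` is hypothetical and not part of the module):

```python
import numpy as np

def softplus_np(x):
    # NumPy analogue of the softplus formula: log(1 + exp(x)).
    # log1p improves accuracy when exp(x) is small.
    return np.log1p(np.exp(x))

print(softplus_np(0.0))  # log(2) ~= 0.693
```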
@tags.tags('activation', 'abs')
def abs_act(input):
    """
    Returns the symbolic variable that represents the absolute value of
    ``input``.

    .. math::
        f(input) = |input|

    :type input: tensor-like
    :param input: input tensor
    :rtype: Theano variable
    :return: tensor that represents the absolute value of the input
    """
    return theano.tensor.abs_(input)