Tanh
$$\tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}$$
$$\tanh'(x) = 1 - \tanh^2(x)$$
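As a minimal NumPy sketch of the two formulas above (the names `tanh` and `tanh_prime` are illustrative, not part of any particular library):

```python
import numpy as np

def tanh(x):
    # tanh(x) = (e^x - e^{-x}) / (e^x + e^{-x})
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

def tanh_prime(x):
    # tanh'(x) = 1 - tanh^2(x)
    return 1.0 - tanh(x) ** 2

# The output saturates toward -1 and 1; the derivative peaks at 1
# when x = 0 and decays toward 0 in the tails.
x = np.array([-2.0, 0.0, 2.0])
print(tanh(x))        # [-0.96402758  0.          0.96402758]
print(tanh_prime(x))  # [ 0.07065082  1.          0.07065082]
```

In practice, `np.tanh(x)` is the safer choice for the forward pass, since evaluating `np.exp(x)` directly overflows for large `|x|`; the explicit formula is shown only to mirror the definition.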