Neural Network
Sigmoid
\sigma(x) = \frac{1}{1 + e^{-x}}
\sigma'(x) = \sigma(x)\,(1 - \sigma(x))
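The two formulas above translate directly into code. A minimal NumPy sketch of the sigmoid and its derivative (the function names `sigmoid` and `sigmoid_prime` are illustrative, not from the original):

```python
import numpy as np

def sigmoid(x):
    # sigma(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    # sigma'(x) = sigma(x) * (1 - sigma(x)),
    # computed by reusing the forward value rather than re-deriving
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid(0.0))        # 0.5
print(sigmoid_prime(0.0))  # 0.25 (the derivative's maximum, at x = 0)
```

Note that the derivative is expressed entirely in terms of the forward output, which is why networks typically cache the activation value during the forward pass and reuse it in backpropagation.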