ELU
The Exponential Linear Unit is defined, for a parameter α (typically α = 1), as:

f(x) = \begin{cases} \alpha (e^x-1) & x < 0 \\ x & \text{otherwise}\end{cases}
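A minimal NumPy sketch of this definition (the function name and signature are my own, not from this page):

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU activation: alpha * (e^x - 1) for x < 0, x otherwise."""
    x = np.asarray(x, dtype=float)
    return np.where(x < 0, alpha * (np.exp(x) - 1.0), x)
```

For positive inputs ELU is the identity; for negative inputs it saturates smoothly toward −α, e.g. `elu(-1.0)` is `e^{-1} - 1 ≈ -0.632`.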
Its derivative, with the same parameter α, is:

f^{\prime}(x) = \begin{cases} \alpha e^x & x < 0 \\ 1 & x > 0\end{cases}