LeakyReLU
The LeakyReLU activation passes positive inputs through unchanged and scales negative inputs by a small constant slope $\alpha$:

$$f(x) = \begin{cases} \alpha x & x < 0 \\ x & \text{otherwise} \end{cases}$$
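A minimal NumPy sketch of this forward pass (the function name `leaky_relu` and the default `alpha=0.01` are illustrative choices, not taken from this page):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """LeakyReLU: x for x >= 0, alpha * x for x < 0."""
    # np.where applies the piecewise formula elementwise
    return np.where(x < 0, alpha * x, x)

print(leaky_relu(np.array([-2.0, -0.5, 0.0, 3.0])))
# [-0.02  -0.005  0.     3.   ]
```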
Its derivative is constant on each branch, $\alpha$ for negative inputs and $1$ for positive inputs:

$$f^{\prime}(x) = \begin{cases} \alpha & x < 0 \\ 1 & x > 0 \end{cases}$$
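A matching sketch of the derivative, again with an illustrative `alpha`. Note that the piecewise definition above leaves $f^{\prime}(0)$ unspecified; this sketch follows the common convention of assigning the positive branch's value, $1$, at zero:

```python
import numpy as np

def leaky_relu_grad(x, alpha=0.01):
    """Derivative of LeakyReLU: alpha for x < 0, 1 for x > 0."""
    # Convention (assumption): treat x == 0 as the positive branch
    return np.where(x < 0, alpha, 1.0)

print(leaky_relu_grad(np.array([-2.0, 0.0, 3.0])))
# [0.01  1.    1.  ]
```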