ReLU
f(x) = \max(0, x)
f^{\prime}(x) = \begin{cases} 0 & x < 0 \\ 1 & x > 0 \end{cases}
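
A minimal NumPy sketch of ReLU and its derivative, following the two formulas above. This is an illustration only, not this library's implementation; the derivative is undefined at x = 0, and this sketch uses the common convention of returning 0 there.

import numpy as np

def relu(x):
    # Forward pass: f(x) = max(0, x), applied element-wise.
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative: 0 for x < 0, 1 for x > 0.
    # At x = 0 the derivative is undefined; 0 is used here by convention.
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]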