ReLU6
ReLU6 is the standard rectified linear unit with its output capped at 6:

f(x) = \min(\max(0, x), 6)
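A minimal NumPy sketch of the forward pass (the helper name `relu6` and the sample input are illustrative, not from this page):

```python
import numpy as np

def relu6(x):
    """ReLU6: clip activations to the range [0, 6]."""
    return np.minimum(np.maximum(0.0, x), 6.0)

x = np.array([-2.0, 0.5, 3.0, 7.0])
print(relu6(x))  # [0.  0.5 3.  6. ]
```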
Its derivative is:

f^{\prime}(x) = \begin{cases} 0 & x < 0 \text{ or } x > 6 \\ 1 & \text{otherwise} \end{cases}
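A matching sketch of the derivative, following the piecewise definition above (again a hypothetical helper, reusing the NumPy setup from the previous block):

```python
def relu6_grad(x):
    """Derivative of ReLU6: 0 where x < 0 or x > 6, and 1 otherwise."""
    return np.where((x < 0) | (x > 6), 0.0, 1.0)

print(relu6_grad(x))  # [0. 1. 1. 0.]
```

Note that the gradient is exactly 1 on the open interval and 0 outside it, so ReLU6 avoids exploding activations at the cost of zero gradient flow once a unit saturates at either end.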