Hard sigmoid activation function.
tf.keras.activations.hard_sigmoid(
x
)
The hard sigmoid activation is defined as:

hard_sigmoid(x) = 0              if x < -2.5
hard_sigmoid(x) = 1              if x > 2.5
hard_sigmoid(x) = 0.2 * x + 0.5  if -2.5 <= x <= 2.5
It's a faster, piecewise linear approximation of the sigmoid activation.
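The piecewise definition above can be sketched in plain NumPy (this is an illustrative re-implementation, not the Keras source; the middle branch `0.2 * x + 0.5` clipped to `[0, 1]` reproduces all three cases):

```python
import numpy as np

def hard_sigmoid(x):
    # Piecewise linear approximation of sigmoid:
    # 0 for x < -2.5, 1 for x > 2.5, 0.2 * x + 0.5 in between.
    # Clipping the linear segment to [0, 1] covers all three branches.
    return np.clip(0.2 * np.asarray(x) + 0.5, 0.0, 1.0)

print(hard_sigmoid([-3.0, -1.0, 0.0, 1.0, 3.0]))
```

Because it avoids the exponential in the true sigmoid, each element costs only a multiply, an add, and a clamp.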
Args | Description
---|---
`x` | Input tensor.