Hard SiLU activation function, also known as Hard Swish.
tf.keras.activations.hard_silu(x)
It is defined as:

- 0 if x < -3
- x if x > 3
- x * (x + 3) / 6 if -3 <= x <= 3
It's a faster, piecewise linear approximation of the silu activation.
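The piecewise definition above can be sketched in plain NumPy (a minimal illustration of the formula, not the actual Keras implementation):

```python
import numpy as np

def hard_silu(x):
    # Piecewise-linear approximation of SiLU (Hard Swish):
    #   0                for x < -3
    #   x                for x > 3
    #   x * (x + 3) / 6  for -3 <= x <= 3
    x = np.asarray(x, dtype=np.float64)
    return np.where(x < -3, 0.0, np.where(x > 3, x, x * (x + 3) / 6))

print(hard_silu([-4.0, -3.0, 0.0, 1.0, 3.0, 4.0]))
```

Note that the middle branch agrees with the outer branches at the boundaries: at x = -3 it gives -3 * 0 / 6 = 0, and at x = 3 it gives 3 * 6 / 6 = 3, so the function is continuous.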
Args | |
---|---|
x | Input tensor. |