Swish (or Silu) activation function.
tf.keras.activations.silu(
    x
)
It is defined as: `swish(x) = x * sigmoid(x)`.
The Swish (or Silu) activation function is a smooth, non-monotonic function that is unbounded above and bounded below.
| Args | |
|---|---|
| `x` | Input tensor. |
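As a minimal sketch of the definition above (plain NumPy, not the TensorFlow implementation), the function and its bounded-below, unbounded-above behavior can be illustrated as:

```python
import numpy as np

def silu(x):
    # swish(x) = x * sigmoid(x)
    return x * (1.0 / (1.0 + np.exp(-x)))

x = np.array([-20.0, -1.0, 0.0, 1.0, 20.0])
y = silu(x)
# Large negative inputs approach 0 from below (bounded below),
# while large positive inputs grow without bound (unbounded above).
print(y)
```

Note that `silu` is non-monotonic: it dips slightly below zero for moderately negative inputs before flattening toward zero.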