Computes the hard swish activation, `hard_swish(x) = x * relu6(x + 3) / 6`, a piecewise-linear approximation of the swish function `x * sigmoid(x)`.
```python
tfm.utils.activations.hard_swish(
    features
)
```
Because it replaces the sigmoid with cheap piecewise-linear operations, hard swish reduces computational cost and quantizes more accurately, which makes it well suited to edge devices.
Args | |
---|---|
`features` | A `Tensor` representing preactivation values. |

Returns | |
---|---|
The activation value. |
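
As a sketch of the math (not the library implementation), the same activation can be written with NumPy; the standalone `hard_swish` function below is hypothetical and mirrors the `x * relu6(x + 3) / 6` formula:

```python
import numpy as np

def hard_swish(features):
    # Hard swish: x * relu6(x + 3) / 6.
    # np.clip(x + 3, 0, 6) plays the role of relu6(x + 3).
    features = np.asarray(features, dtype=np.float64)
    return features * np.clip(features + 3.0, 0.0, 6.0) / 6.0

# Saturates to 0 for x <= -3 and to x for x >= 3.
print(hard_swish([-4.0, -3.0, 0.0, 1.0, 3.0, 4.0]))
```

For inputs at or below `-3` the output is exactly zero, and at or above `3` it is the identity, so only the interval `(-3, 3)` requires any arithmetic beyond a comparison.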