Gaussian Error Linear Unit.
tfm.utils.activations.gelu(x)
This is a smoother version of the ReLU. Original paper: https://arxiv.org/abs/1606.08415

Args:
  x: float Tensor on which to apply the activation.
Returns:
  x with the GELU activation applied.
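For reference, GELU(x) = x * Φ(x), where Φ is the standard normal CDF; the original paper also gives the tanh approximation 0.5 * x * (1 + tanh(sqrt(2/π) * (x + 0.044715 * x³))). The snippet below is a minimal sketch, not this library's implementation: it assumes TensorFlow 2.x is available, computes the exact form with tf.math.erf, and compares it against the built-in tf.nn.gelu (which is exact by default).

```python
import math

import tensorflow as tf


def gelu_reference(x):
  # Exact GELU: x * Phi(x), with Phi the standard normal CDF,
  # written via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
  x = tf.convert_to_tensor(x, dtype=tf.float32)
  return x * 0.5 * (1.0 + tf.math.erf(x / math.sqrt(2.0)))


x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])
print(gelu_reference(x).numpy())  # hand-rolled exact GELU
print(tf.nn.gelu(x).numpy())      # TensorFlow's built-in GELU for comparison
```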