
tfm.utils.activations.gelu

Gaussian Error Linear Unit.

This is a smoother version of the ReLU. Original paper: https://arxiv.org/abs/1606.08415

Args
x: float Tensor to perform activation.

Returns
x with the GELU activation applied.
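
A minimal usage sketch. It assumes the standard `import tensorflow_models as tfm` alias used across the TF Model Garden docs, and it compares the result against the tanh approximation from the paper, GELU(x) ≈ 0.5·x·(1 + tanh(√(2/π)·(x + 0.044715·x³))); whether this function computes the exact (erf-based) or approximate form is not stated here, so the two outputs may differ slightly.

```python
import numpy as np
import tensorflow as tf
import tensorflow_models as tfm  # assumed import alias for the Model Garden

# Apply GELU to a small batch of values straddling zero.
x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0], dtype=tf.float32)
y = tfm.utils.activations.gelu(x)

# Reference: the tanh approximation from the original paper.
y_ref = 0.5 * x * (1.0 + tf.tanh(
    np.sqrt(2.0 / np.pi) * (x + 0.044715 * tf.pow(x, 3))))

print(y.numpy())      # GELU as computed by tfm
print(y_ref.numpy())  # tanh-approximate GELU for comparison
```

Unlike ReLU, which zeroes all negative inputs, GELU lets small negative values pass through attenuated, which is the "smoother" behavior the description refers to.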