tfa.activations.gelu

Gaussian Error Linear Unit.

Computes the Gaussian error linear unit:

$$\mathrm{gelu}(x) = x \Phi(x),$$

where

$$\Phi(x) = \frac{1}{2} \left[ 1 + \mathrm{erf}\left( \frac{x}{\sqrt{2}} \right) \right]$$

when approximate is False; or

$$\Phi(x) = \frac{1}{2} \left[ 1 + \tanh\left( \sqrt{\frac{2}{\pi}} \left( x + 0.044715 x^{3} \right) \right) \right]$$

when approximate is True.
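For reference, both variants follow directly from these formulas. A minimal NumPy sketch for illustration (assuming SciPy is available for erf; this is not the library's actual implementation):

import numpy as np
from scipy.special import erf

def gelu_exact(x):
    # Phi(x) = 0.5 * [1 + erf(x / sqrt(2))], the standard normal CDF.
    return x * 0.5 * (1.0 + erf(x / np.sqrt(2.0)))

def gelu_approximate(x):
    # Phi(x) ~= 0.5 * [1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3))].
    return x * 0.5 * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

x = np.array([-1.0, 0.0, 1.0])
print(gelu_exact(x))        # approx. [-0.15865529  0.          0.8413447 ]
print(gelu_approximate(x))  # approx. [-0.15880796  0.          0.841192  ]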

See "Gaussian Error Linear Units (GELUs)" and "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".

Consider using tf.nn.gelu instead. Note that the default of approximate changed to False in tf.nn.gelu.
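In recent TensorFlow releases (tf.nn.gelu was added around TF 2.4), the same values are available from core TensorFlow; a brief sketch, passing approximate explicitly since the two defaults differ:

import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])
# tf.nn.gelu defaults to approximate=False, while tfa.activations.gelu
# defaults to approximate=True, so pass the flag explicitly.
tf.nn.gelu(x, approximate=False)  # exact erf-based form
tf.nn.gelu(x, approximate=True)   # tanh approximation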

Usage:

import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-1.0, 0.0, 1.0])
tfa.activations.gelu(x, approximate=False)
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([-0.15865529,  0.        ,  0.8413447 ], dtype=float32)>
tfa.activations.gelu(x, approximate=True)
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([-0.15880796,  0.        ,  0.841192  ], dtype=float32)>

Args:

x: A Tensor. Must be one of the following types: float16, float32, float64.
approximate: bool, whether to enable approximation.

Returns:

A Tensor. Has the same type as x.
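As a usage note, the function can also be passed directly as a Keras layer activation; a short sketch (layer sizes are illustrative):

import tensorflow as tf
import tensorflow_addons as tfa

# Use GELU as the activation of a hidden Dense layer.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=tfa.activations.gelu),
    tf.keras.layers.Dense(10),
])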