tf.keras.activations.gelu
Gaussian error linear unit (GELU) activation function.
tf.keras.activations.gelu(
    x, approximate=False
)
The Gaussian error linear unit (GELU) is defined as:

gelu(x) = x * P(X <= x), where P(X) ~ N(0, 1),

i.e. gelu(x) = 0.5 * x * (1 + erf(x / sqrt(2))).
GELU weights inputs by their value, rather than gating
inputs by their sign as in ReLU.
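A minimal usage sketch; the printed values are approximate, computed from the erf formula above:

import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0], dtype=tf.float32)
y = tf.keras.activations.gelu(x)
print(y.numpy())
# ~ [-0.00404951 -0.15865529  0.          0.8413447   2.9959507 ]

# approximate=True selects the faster tanh-based variant instead
y_approx = tf.keras.activations.gelu(x, approximate=True)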
Args

x: Input tensor.
approximate: A bool, whether to use the faster tanh-based approximation of GELU instead of the exact erf formula (see the sketch below). Defaults to False.
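For reference, a standalone sketch of both formulas. This mirrors the math above under the stated definitions; it is not the library's internal implementation:

import math
import tensorflow as tf

def gelu_reference(x, approximate=False):
    # Illustrative re-implementation of the two GELU forms above.
    if approximate:
        # tanh-based approximation from the GELU paper
        return 0.5 * x * (1.0 + tf.math.tanh(
            math.sqrt(2.0 / math.pi) * (x + 0.044715 * tf.pow(x, 3.0))))
    # exact form: gelu(x) = 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + tf.math.erf(x / math.sqrt(2.0)))

The activation can also be selected by name in layers, e.g. tf.keras.layers.Dense(64, activation="gelu").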
Reference:
Hendrycks, D., & Gimpel, K. (2016). Gaussian Error Linear Units (GELUs). https://arxiv.org/abs/1606.08415