tf.keras.ops.leaky_relu
Leaky version of a Rectified Linear Unit activation function.
Main aliases:
tf.keras.ops.nn.leaky_relu
tf.keras.ops.leaky_relu(
x, negative_slope=0.2
)
It allows a small gradient when the unit is not active. It is defined as:

f(x) = negative_slope * x for x < 0
f(x) = x for x >= 0
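For intuition, here is a minimal NumPy sketch of the same piecewise rule (an illustrative reference under the definition above, not the library's actual implementation):

import numpy as np

def leaky_relu_reference(x, negative_slope=0.2):
    # Positive inputs pass through unchanged; negative inputs are
    # scaled by negative_slope so they keep a small, nonzero gradient.
    return np.where(x >= 0, x, negative_slope * x)

print(leaky_relu_reference(np.array([-1., 0., 1.])))  # [-0.2  0.   1. ]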
Args
x: Input tensor.
negative_slope: Slope of the activation function at x < 0. Defaults to 0.2.

Returns
A tensor with the same shape as x.
Example:
import numpy as np
import keras

x = np.array([-1., 0., 1.])
x_leaky_relu = keras.ops.leaky_relu(x)
print(x_leaky_relu)
array([-0.2, 0. , 1. ], shape=(3,), dtype=float64)
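The negative_slope argument can also be set explicitly; for example (expected values follow from the definition above; the exact printed representation depends on the backend):

x = np.array([-1., 0., 1.])
print(keras.ops.leaky_relu(x, negative_slope=0.5))  # [-0.5  0.   1. ]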