tf.nn.leaky_relu
Compute the Leaky ReLU activation function.
Compat aliases for migration (see the migration guide for more details):
tf.compat.v1.nn.leaky_relu
tf.nn.leaky_relu(
    features, alpha=0.2, name=None
)
Source: Rectifier Nonlinearities Improve Neural Network Acoustic Models. A. L. Maas, A. Y. Hannun, A. Y. Ng. Proc. ICML, 2013.
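In the paper's formulation, leaky ReLU is the identity for non-negative inputs and scales negative inputs by a small slope, i.e. f(x) = max(x, alpha * x) for alpha <= 1. As a minimal reference sketch of that formula (an illustration in NumPy, not TensorFlow's actual implementation):

import numpy as np

def leaky_relu_ref(x, alpha=0.2):
    # Identity for x >= 0; scale negative inputs by the slope alpha.
    return np.where(x >= 0, x, alpha * x)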
Args

features: A Tensor representing preactivation values. Must be one of the following types: float16, float32, float64, int32, int64.
alpha: Slope of the activation function at x < 0.
name: A name for the operation (optional).

Returns

The activation value.
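For illustration, a brief usage sketch; the values in the comments follow from f(x) = max(x, alpha * x), though the exact printed formatting may differ:

import tensorflow as tf

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# Default slope alpha=0.2 scales negative inputs by 0.2.
print(tf.nn.leaky_relu(x).numpy())             # [-0.4 -0.2  0.   1.   2. ]

# A different negative slope can be passed explicitly.
print(tf.nn.leaky_relu(x, alpha=0.1).numpy())  # [-0.2 -0.1  0.   1.   2. ]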