Compute the Leaky ReLU activation function.
Compat aliases for migration (see the Migration guide for details): `tf.compat.v1.nn.leaky_relu`, `tf.compat.v2.nn.leaky_relu`
```python
tf.nn.leaky_relu(
    features, alpha=0.2, name=None
)
```
| Args | |
|---|---|
| `features` | A `Tensor` representing preactivation values. Must be one of the following types: `float16`, `float32`, `float64`, `int32`, `int64`. |
| `alpha` | Slope of the activation function at `x < 0`. |
| `name` | A name for the operation (optional). |
| Returns |
|---|
| The activation value. |
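Leaky ReLU computes `f(x) = x` for `x > 0` and `f(x) = alpha * x` otherwise, i.e. `max(alpha * x, x)` for `alpha <= 1`. A minimal NumPy sketch of the same elementwise computation (using NumPy rather than TensorFlow, purely for illustration):

```python
import numpy as np

def leaky_relu(features, alpha=0.2):
    # Elementwise: keep positive values, scale negative values by alpha.
    features = np.asarray(features, dtype=np.float32)
    return np.where(features > 0, features, alpha * features)

# Negative inputs are scaled by alpha=0.2 instead of being zeroed out.
print(leaky_relu([-2.0, 0.0, 3.0]))
```

Unlike standard ReLU, the small negative slope keeps a nonzero gradient for `x < 0`, which helps avoid "dead" units during training.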