Leaky version of a Rectified Linear Unit.
Inherits From: `Layer`

Compat aliases for migration (see the Migration guide for more details): `tf.compat.v1.keras.layers.LeakyReLU`, `tf.compat.v2.keras.layers.LeakyReLU`
```python
tf.keras.layers.LeakyReLU(
    alpha=0.3, **kwargs
)
```
It allows a small gradient when the unit is not active:

```
f(x) = alpha * x  for x < 0
f(x) = x          for x >= 0
```
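For example (a minimal sketch, assuming TensorFlow 2.x eager execution), the default `alpha=0.3` scales negative inputs while passing non-negative inputs through unchanged:

```python
import tensorflow as tf

# Default alpha=0.3: negative values are scaled by alpha,
# non-negative values pass through unchanged.
layer = tf.keras.layers.LeakyReLU()
output = layer(tf.constant([-3.0, -1.0, 0.0, 2.0]))
print(output.numpy())  # [-0.9 -0.3  0.   2. ]
```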
Input shape:
Arbitrary. Use the keyword argument `input_shape` (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape:
Same shape as the input.
| Arguments | |
|---|---|
| `alpha` | Float >= 0. Negative slope coefficient. |
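As a usage sketch (the 4-feature input below is a hypothetical example), the layer can serve as the first layer of a `Sequential` model via the `input_shape` keyword argument, and its output shape matches its input shape:

```python
import tensorflow as tf

# LeakyReLU as the first layer of a model: input_shape excludes the
# samples axis, and the layer's output shape equals its input shape.
model = tf.keras.Sequential([
    tf.keras.layers.LeakyReLU(alpha=0.1, input_shape=(4,)),  # output: (None, 4)
    tf.keras.layers.Dense(1),
])
model.summary()
```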