tf.keras.layers.ReLU

Rectified Linear Unit activation function.

Inherits From: Layer

Compat aliases for migration

See Migration guide for more details.

`tf.compat.v1.keras.layers.ReLU`, `tf.compat.v2.keras.layers.ReLU`

With default values, it returns element-wise `max(x, 0)`.
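
A quick doctest-style sketch of the default behavior (the output values follow directly from `max(x, 0)`; exact print formatting may vary by TF version):

```python
import tensorflow as tf

layer = tf.keras.layers.ReLU()
output = layer([-3.0, -1.0, 0.0, 2.0])
print(list(output.numpy()))  # [0.0, 0.0, 0.0, 2.0] -- negatives clipped to 0
```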

Otherwise, it follows:

  f(x) = max_value for x >= max_value
  f(x) = x for threshold <= x < max_value
  f(x) = negative_slope * (x - threshold) otherwise
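
Each argument maps to one branch of the rule above. A sketch of the three variants in isolation (output values hand-derived from the formula):

```python
import tensorflow as tf

# max_value caps activations from above: f(x) = max_value for x >= max_value.
layer = tf.keras.layers.ReLU(max_value=1.0)
print(list(layer([-3.0, -1.0, 0.0, 2.0]).numpy()))  # [0.0, 0.0, 0.0, 1.0]

# negative_slope gives a leaky ReLU: f(x) = negative_slope * x for x < 0
# (threshold defaults to 0).
layer = tf.keras.layers.ReLU(negative_slope=0.5)
print(list(layer([-3.0, -1.0, 0.0, 2.0]).numpy()))  # [-1.5, -0.5, 0.0, 2.0]

# threshold zeroes everything below it (with the default negative_slope of 0).
layer = tf.keras.layers.ReLU(threshold=1.5)
print(list(layer([-3.0, -1.0, 1.0, 2.0]).numpy()))  # [0.0, 0.0, 0.0, 2.0]
```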

Input shape:

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape:

Same shape as the input.
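
For instance, a minimal sketch of using ReLU as the first layer of a model via the `input_shape` keyword (the layer sizes here are illustrative); the layer's output shape matches its input:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    # First layer in the model, so input_shape (a tuple of integers,
    # excluding the samples axis) is supplied here.
    tf.keras.layers.ReLU(input_shape=(4,)),
    tf.keras.layers.Dense(2),
])
model.summary()  # the ReLU layer reports output shape (None, 4), same as input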

Arguments:

max_value: Float >= 0. Maximum activation value.
negative_slope: Float >= 0. Negative slope coefficient.
threshold: Float. Threshold value for thresholded activation.