
tf.keras.layers.ReLU


Class ReLU

Rectified Linear Unit activation function.

Inherits From: Layer

Aliases:

  • Class tf.compat.v1.keras.layers.ReLU
  • Class tf.compat.v2.keras.layers.ReLU

With default values, it returns element-wise max(x, 0).

Otherwise, it follows the piecewise definition:

  • f(x) = max_value for x >= max_value
  • f(x) = x for threshold <= x < max_value
  • f(x) = negative_slope * (x - threshold) otherwise
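
For instance, a minimal sketch of this piecewise behavior (the parameter values below are illustrative, not defaults):

    import tensorflow as tf

    layer = tf.keras.layers.ReLU(max_value=6.0, negative_slope=0.1, threshold=1.0)
    x = tf.constant([-2.0, 0.5, 3.0, 10.0])
    print(layer(x).numpy())
    # approximately [-0.3 -0.05 3. 6.]:
    #   -2.0 and 0.5 fall below threshold=1.0, so f(x) = 0.1 * (x - 1.0)
    #   3.0 lies in [threshold, max_value), so it passes through unchanged
    #   10.0 exceeds max_value=6.0, so it is clipped to 6.0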

Input shape:

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
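
A minimal sketch of this, with ReLU as the first layer of a Sequential model (the shape (4,) is an illustrative assumption):

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.ReLU(input_shape=(4,)),  # samples (batch) axis omitted
        tf.keras.layers.Dense(2),
    ])
    model.summary()  # the ReLU layer reports output shape (None, 4)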

Output shape:

Same shape as the input.

Arguments:

  • max_value: Float >= 0. Maximum activation value. Defaults to None, meaning unlimited.
  • negative_slope: Float >= 0. Negative slope coefficient. Defaults to 0.
  • threshold: Float. Threshold value for thresholded activation. Defaults to 0.

__init__


__init__(
    max_value=None,
    negative_slope=0,
    threshold=0,
    **kwargs
)
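
As a usage sketch, constructing the layer with max_value=6.0 and the remaining defaults reproduces the common "ReLU6" clipping (the input values are chosen for illustration):

    import tensorflow as tf

    relu6 = tf.keras.layers.ReLU(max_value=6.0)
    print(relu6(tf.constant([-1.0, 3.0, 9.0])).numpy())
    # [0. 3. 6.]: negatives zeroed, values above 6 clipped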