tf.keras.layers.ThresholdedReLU

Thresholded Rectified Linear Unit.

Inherits From: Layer

It follows:

  f(x) = x for x > theta
  f(x) = 0 otherwise
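
For example, with theta = 1.0 only inputs strictly greater than 1.0 pass through unchanged; everything else maps to 0. A minimal sketch (the sample tensor values below are illustrative):

  import tensorflow as tf

  layer = tf.keras.layers.ThresholdedReLU(theta=1.0)
  x = tf.constant([-1.0, 0.5, 1.0, 2.0])
  print(layer(x).numpy())  # [0. 0. 0. 2.] -- 1.0 is zeroed, since x > theta is strict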

Input shape:

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
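A minimal sketch of using ThresholdedReLU as the first layer of a Sequential model (the (4,) feature size here is an assumed example):

  import tensorflow as tf

  model = tf.keras.Sequential([
      tf.keras.layers.ThresholdedReLU(theta=1.0, input_shape=(4,)),
  ])
  model.summary()  # output shape (None, 4) matches the input shape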

Output shape:

Same shape as the input.

Args:

theta: Float >= 0. Threshold location of activation.
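
Note that with theta = 0.0 the layer behaves like a standard ReLU, zeroing everything that is not strictly positive; a minimal sketch:

  import tensorflow as tf

  relu_like = tf.keras.layers.ThresholdedReLU(theta=0.0)
  print(relu_like(tf.constant([-2.0, 0.0, 3.0])).numpy())  # [0. 0. 3.]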