Thresholded Rectified Linear Unit.
Inherits From: `Layer`, `Module`
Compat aliases for migration (see the Migration guide for more details):
`tf.compat.v1.keras.layers.ThresholdedReLU`
```python
tf.keras.layers.ThresholdedReLU(
    theta=1.0, **kwargs
)
```
It follows:

```
f(x) = x for x > theta
f(x) = 0 otherwise
```
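For instance, a minimal sketch (assuming TensorFlow 2.x with eager execution) of the layer applied to a small tensor; note the strict inequality, so values equal to `theta` are zeroed:

```python
import tensorflow as tf

# Values at or below theta are zeroed; values strictly above pass through.
layer = tf.keras.layers.ThresholdedReLU(theta=1.0)
x = tf.constant([-2.0, 0.5, 1.0, 3.0])
print(layer(x).numpy())  # [0. 0. 0. 3.]
```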
Input shape:
Arbitrary. Use the keyword argument `input_shape` (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape:
Same shape as the input.
Args:
`theta`: Float >= 0. Threshold location of activation.
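As a usage sketch (the model architecture and `theta` value here are illustrative assumptions, not taken from the docs), the layer can follow a `Dense` layer inside a `Sequential` model, preserving its output shape:

```python
import tensorflow as tf

# Illustrative model: ThresholdedReLU keeps the shape produced by Dense.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, input_shape=(4,)),   # first layer takes input_shape
    tf.keras.layers.ThresholdedReLU(theta=0.5),   # zeroes activations <= 0.5
])
y = model(tf.random.normal((2, 4)))
print(y.shape)  # (2, 8): same shape as the Dense output
```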