Thresholded Rectified Linear Unit.
```python
tf.keras.layers.ThresholdedReLU(
    theta=1.0, **kwargs
)
```
It follows:

```
f(x) = x for x > theta
f(x) = 0 otherwise
```
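A minimal sketch of the element-wise behavior, assuming TensorFlow 2.x; the input values are illustrative:

```python
import tensorflow as tf

# Values at or below theta are zeroed; values strictly above pass through.
layer = tf.keras.layers.ThresholdedReLU(theta=1.0)
x = tf.constant([-1.0, 0.5, 1.0, 2.5])
print(layer(x).numpy())  # [0.  0.  0.  2.5]
```

Note that the threshold is strict: an input exactly equal to `theta` (here, 1.0) maps to 0.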
Input shape:
Arbitrary. Use the keyword argument `input_shape` (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape:
Same shape as the input.

Args:
`theta`: Float >= 0. Threshold location of activation.
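A short usage sketch showing `theta` and `input_shape` together as described above; the layer sizes are illustrative assumptions, not part of the API:

```python
import tensorflow as tf

# ThresholdedReLU as the first layer of a model; input_shape
# omits the samples (batch) axis.
model = tf.keras.Sequential([
    tf.keras.layers.ThresholdedReLU(theta=0.5, input_shape=(4,)),
    tf.keras.layers.Dense(2),
])
model.summary()
```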