tfa.activations.rrelu

Randomized leaky rectified linear unit function.

Computes the rrelu function:

$$
\mathrm{rrelu}(x) =
\begin{cases}
x, & \text{if } x > 0 \\
a x, & \text{otherwise}
\end{cases}
$$

where

$$a \sim \mathcal{U}(\text{lower}, \text{upper})$$

when training is True; or

$$a = \frac{\text{lower} + \text{upper}}{2}$$

when training is False.

See Empirical Evaluation of Rectified Activations in Convolutional Network.
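The rule above maps directly onto element-wise TensorFlow ops. The following is a minimal sketch of that behaviour, not the library's implementation; the bounds 1/8 and 1/3 are assumptions that follow the paper's recommendation and are consistent with the inference-mode output in the usage example below, since (0.125 + 1/3) / 2 ≈ 0.22917.

import tensorflow as tf

def rrelu_sketch(x, lower=0.125, upper=1.0 / 3, training=False):
    # Sketch of the rrelu rule (not the tfa implementation): sample the
    # negative-side slope `a` uniformly from [lower, upper] during training,
    # and use the midpoint of the interval at inference time.
    x = tf.convert_to_tensor(x)
    if training:
        a = tf.random.uniform(tf.shape(x), lower, upper, dtype=x.dtype)
    else:
        a = tf.cast((lower + upper) / 2, x.dtype)
    return tf.where(x > 0, x, a * x)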

Usage:

import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-1.0, 0.0, 1.0])
# Inference mode: negative inputs are scaled by the fixed midpoint of [lower, upper].
tfa.activations.rrelu(x, training=False)
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([-0.22916667,  0.        ,  1.        ], dtype=float32)>
# Training mode with an operation-level seed.
tfa.activations.rrelu(x, training=True, seed=2020)
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([-0.22631127,  0.        ,  1.        ], dtype=float32)>
# Training mode drawing randomness from an explicit tf.random.Generator.
generator = tf.random.Generator.from_seed(2021)
tfa.activations.rrelu(x, training=True, rng=generator)
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([-0.16031083,  0.        ,  1.        ], dtype=float32)>
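The lower and upper bounds can also be overridden per call. With training=False the slope is the deterministic midpoint (lower + upper) / 2, so changing the bounds only changes the constant applied to negative inputs; the bound values below are arbitrary illustrations.

tfa.activations.rrelu(x, lower=0.2, upper=0.4, training=False)
# Negative inputs are scaled by (0.2 + 0.4) / 2 = 0.3, giving approximately [-0.3, 0., 1.].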

Args:
x: A Tensor. Must be one of the following types: bfloat16, float16, float32, float64.
lower: float, lower bound for random alpha.
upper: float, upper bound for random alpha.
training: bool, indicating whether the call is meant for training or inference.
seed: int, this sets the operation-level seed.
rng: A tf.random.Generator.

Returns:
result: A Tensor. Has the same type as x.
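Because the sampling behaviour depends on the training argument, a common pattern is to wrap the activation in a small layer that forwards Keras's training flag. The RReLU layer below is a hypothetical wrapper written for illustration; it is not part of tfa.layers.

import tensorflow as tf
import tensorflow_addons as tfa

class RReLU(tf.keras.layers.Layer):
    # Hypothetical wrapper layer (not part of tfa.layers): forwards the Keras
    # training flag so random slopes are only sampled during model training.
    def __init__(self, lower=0.125, upper=1.0 / 3, **kwargs):
        super().__init__(**kwargs)
        self.lower = lower
        self.upper = upper

    def call(self, inputs, training=None):
        return tfa.activations.rrelu(
            inputs,
            lower=self.lower,
            upper=self.upper,
            training=bool(training),
        )

model = tf.keras.Sequential([tf.keras.layers.Dense(64), RReLU()])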