Warning: This project is deprecated. TensorFlow Addons has stopped development; the project will only provide minimal maintenance releases until May 2024. See the full announcement on GitHub.

tfa.activations.rrelu

Randomized leaky rectified linear unit function.

Computes the rrelu function:

\[ \mathrm{rrelu}(x) = \begin{cases} x & \text{if } x > 0 \\ a x & \text{otherwise} \end{cases}, \]

where

\[ a \sim \mathcal{U}(\mathrm{lower}, \mathrm{upper}) \]

when training is True; or

\[ a = \frac{\mathrm{lower} + \mathrm{upper} }{2} \]

when training is False.

See "Empirical Evaluation of Rectified Activations in Convolutional Network" (Xu et al., 2015).
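The two formulas above translate directly into a few lines of code. The following is a minimal NumPy sketch (a hypothetical helper named rrelu_reference, not the library's implementation; the default bounds of 0.125 and 1/3 are assumptions inferred from the inference output in the usage example below, and alpha is sampled per element here):

import numpy as np

def rrelu_reference(x, lower=0.125, upper=1 / 3, training=False, rng=None):
    # Illustrative reference only, not tfa's implementation.
    x = np.asarray(x)
    if training:
        # a ~ U(lower, upper), sampled independently for each element here.
        rng = rng or np.random.default_rng()
        a = rng.uniform(lower, upper, size=x.shape)
    else:
        # a = (lower + upper) / 2, a fixed slope at inference time.
        a = (lower + upper) / 2
    return np.where(x > 0, x, a * x)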

Usage:

import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-1.0, 0.0, 1.0])
tfa.activations.rrelu(x, training=False)
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([-0.22916667,  0.        ,  1.        ], dtype=float32)>
tfa.activations.rrelu(x, training=True, seed=2020)
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([-0.22631127,  0.        ,  1.        ], dtype=float32)>
generator = tf.random.Generator.from_seed(2021)
tfa.activations.rrelu(x, training=True, rng=generator)
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([-0.16031083,  0.        ,  1.        ], dtype=float32)>
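The training=False result above can be checked against the midpoint formula: with bounds of 0.125 and 1/3 (the apparent defaults, inferred from the output), a = (0.125 + 1/3) / 2 ≈ 0.22916667, so rrelu(-1.0) ≈ -0.22916667. A small sketch of that check:

a = (0.125 + 1 / 3) / 2  # midpoint of the assumed default bounds
expected = tf.where(x > 0, x, a * x)
tf.debugging.assert_near(expected, tfa.activations.rrelu(x, training=False))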

Args

x: A Tensor. Must be one of the following types: bfloat16, float16, float32, float64.
lower: float, lower bound for the random alpha.
upper: float, upper bound for the random alpha.
training: bool, indicating whether the call is meant for training or inference.
seed: int, sets the operation-level random seed.
rng: A tf.random.Generator.

Returns

result: A Tensor. Has the same type as x.
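For context, one way to use rrelu inside a Keras model so that the training flag is threaded through automatically is a small wrapper layer. This is a hypothetical sketch (the RReLU class below is not part of tfa); it only forwards the lower, upper, and training arguments documented above:

import tensorflow as tf
import tensorflow_addons as tfa

class RReLU(tf.keras.layers.Layer):
    # Hypothetical wrapper layer, not part of tfa.
    def __init__(self, lower=0.125, upper=1 / 3, **kwargs):
        super().__init__(**kwargs)
        self.lower = lower
        self.upper = upper

    def call(self, inputs, training=None):
        # Keras passes training=True during fit() and None/False at inference.
        return tfa.activations.rrelu(
            inputs, lower=self.lower, upper=self.upper, training=bool(training)
        )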