Computes Concatenated ReLU.
tf.nn.crelu(
    features, axis=-1, name=None
)
Concatenates a ReLU which selects only the positive part of the activation with a ReLU which selects only the negative part of the activation. As a result, this non-linearity doubles the depth of the activations along the given axis.

Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units. W. Shang, et al.
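A minimal usage sketch, assuming TensorFlow 2.x with eager execution; the input values are illustrative. It shows the depth doubling and checks the equivalence to concatenating relu(x) with relu(-x), per the definition above:

import tensorflow as tf

# A batch of 2 examples with 3 channels each.
x = tf.constant([[-1.0, 0.5, 2.0],
                 [3.0, -0.5, -2.0]])

y = tf.nn.crelu(x, axis=-1)
print(y.shape)  # (2, 6): CReLU doubles the depth from 3 to 6

# Per the definition above, this matches concatenating
# relu(x) with relu(-x) along the same axis.
manual = tf.concat([tf.nn.relu(x), tf.nn.relu(-x)], axis=-1)
print(bool(tf.reduce_all(y == manual)))  # True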
Returns
A Tensor with the same type as features.
References
Shang et al., 2016. Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units.