Compute the Leaky ReLU activation function.
```python
tf.nn.leaky_relu(
    features, alpha=0.2, name=None
)
```
Args:
- `features`: A `Tensor` representing preactivation values. Must be one of the following types: `float16`, `float32`, `float64`, `int32`, `int64`.
- `alpha`: Slope of the activation function at `x < 0`.
- `name`: A name for the operation (optional).
Returns:
- The activation value.
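Elementwise, the op returns `features` where `features > 0` and `alpha * features` otherwise, i.e. `max(alpha * x, x)` for `alpha < 1`. A minimal NumPy sketch of this computation (an illustration of the math, not the TensorFlow implementation):

```python
import numpy as np

def leaky_relu(features, alpha=0.2):
    # Elementwise: x where x > 0, else alpha * x.
    features = np.asarray(features, dtype=np.float32)
    return np.where(features > 0, features, alpha * features)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0], dtype=np.float32)
print(leaky_relu(x))  # negative entries scaled by alpha=0.2, rest unchanged
```

Unlike plain ReLU, the negative side keeps a small nonzero gradient (`alpha`), which is the motivation given in the referenced paper.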
References:
- Maas, A. L., Hannun, A. Y., and Ng, A. Y. "Rectifier Nonlinearities Improve Neural Network Acoustic Models." Proc. ICML, 2013.