
Compute the Leaky ReLU activation function, f(x) = max(alpha * x, x).

    tf.nn.leaky_relu(
        features, alpha=0.2, name=None
    )

Source: Rectifier Nonlinearities Improve Neural Network Acoustic Models. AL Maas, AY Hannun, AY Ng - Proc. ICML, 2013.


Args:

  • features: A Tensor representing preactivation values. Must be one of the following types: float16, float32, float64, int32, int64.
  • alpha: Slope of the activation function at x < 0.
  • name: A name for the operation (optional).


Returns:

  The activation value.
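Elementwise, the operation computes f(x) = max(alpha * x, x): positive inputs pass through unchanged, while negative inputs are scaled by alpha. A minimal pure-Python sketch of that formula (an illustration, not TensorFlow's implementation):

```python
def leaky_relu(x, alpha=0.2):
    # f(x) = max(alpha * x, x): identity for x >= 0, slope alpha for x < 0
    return x if x >= 0 else alpha * x

print(leaky_relu(3.0))   # → 3.0 (positive inputs are unchanged)
print(leaky_relu(-5.0))  # → -1.0 (negative inputs scaled by alpha=0.2)
```

With a nonzero alpha, negative preactivations still produce a small gradient, which avoids the "dying ReLU" problem where units stop updating once their output is clamped to zero.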