
tf.raw_ops.Elu

Computes the exponential linear function.

The ELU function is defined as:

  • $e^x - 1$ if $x < 0$
  • $x$ if $x \ge 0$
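The piecewise definition above can be reproduced with a small NumPy sketch (an illustration of the formula, not TensorFlow's actual implementation):

```python
import numpy as np

def elu(x):
    """Exponential linear unit: exp(x) - 1 for x < 0, x otherwise."""
    x = np.asarray(x, dtype=np.float64)
    # expm1(x) computes exp(x) - 1 accurately for small |x|
    return np.where(x < 0, np.expm1(x), x)

print(elu(1.0))      # 1.0
print(elu(0.0))      # 0.0
print(elu(-1000.0))  # -1.0 (ELU saturates at -1 for large negative inputs)
```

Note that for large negative inputs the output approaches $-1$, which is what bounds the function's negative saturation.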

Examples:

tf.nn.elu(1.0)
<tf.Tensor: shape=(), dtype=float32, numpy=1.0>
tf.nn.elu(0.0)
<tf.Tensor: shape=(), dtype=float32, numpy=0.0>
tf.nn.elu(-1000.0)
<tf.Tensor: shape=(), dtype=float32, numpy=-1.0>

See Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs) (Clevert et al., 2015).

Args

  features: A Tensor. Must be one of the following types: half, bfloat16, float32, float64.
  name: A name for the operation (optional).

Returns

  A Tensor. Has the same type as features.