ELU

public class ELU<T extends TNumber> extends Activation<T>

Exponential linear unit.

The exponential linear unit (ELU) with alpha > 0 is:

x if x > 0, and alpha * (exp(x) - 1) if x <= 0.

The ELU hyperparameter alpha controls the value to which an ELU saturates for negative net inputs. ELUs diminish the vanishing gradient effect.

ELUs take on negative values, which pushes the mean of the activations closer to zero. Mean activations closer to zero enable faster learning because they bring the gradient closer to the natural gradient. ELUs saturate to a negative value as the argument becomes more negative. Saturation means a small derivative, which reduces the variation and the amount of information propagated to the next layer.
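
In scalar form, the definition above amounts to the following (a minimal Java sketch for illustration; the class itself operates on tensors):

     // alpha controls the negative saturation value: exp(x) - 1 approaches -1
     // as x becomes very negative, so the output approaches -alpha.
     static double elu(double x, double alpha) {
         return x > 0 ? x : alpha * (Math.exp(x) - 1);
     }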

Example Usage:

     Operand<TFloat32> input = ...;
     ELU<TFloat32> elu = new ELU<>(tf, 2.0f);
     Operand<TFloat32> result = elu.call(input);
 

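A fuller sketch of the same usage in eager mode (assumes the org.tensorflow core and framework artifacts; the input values are illustrative):

     import org.tensorflow.EagerSession;
     import org.tensorflow.Operand;
     import org.tensorflow.framework.activations.ELU;
     import org.tensorflow.op.Ops;
     import org.tensorflow.types.TFloat32;

     public class EluExample {
       public static void main(String[] args) {
         try (EagerSession session = EagerSession.create()) {
           Ops tf = Ops.create(session);
           // A 1-D input mixing negative, zero, and positive values
           Operand<TFloat32> input = tf.constant(new float[] {-2.0f, -0.5f, 0.0f, 1.0f});
           ELU<TFloat32> elu = new ELU<>(tf, 2.0f);
           Operand<TFloat32> result = elu.call(input);
           // In eager mode the result can be read back directly:
           // negative inputs saturate toward -alpha, positive inputs pass through.
           TFloat32 data = result.asTensor();
           for (long i = 0; i < data.size(); i++) {
             System.out.println(data.getFloat(i));
           }
         }
       }
     }
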
Public Constructors

ELU(Ops tf)
Creates a new ELU with alpha=ALPHA_DEFAULT (1.0).
ELU(Ops tf, double alpha)
Creates a new ELU with the given alpha.

Public Methods

Operand<T> call(Operand<T> input)
Gets the calculation operation for the activation.

Inherited Methods

From class org.tensorflow.framework.activations.Activation
abstract Operand<T> call(Operand<T> input)
Gets the calculation operation for the activation.

From class java.lang.Object
boolean equals(Object arg0)
final Class<?> getClass()
int hashCode()
final void notify()
final void notifyAll()
String toString()
final void wait(long arg0, int arg1)
final void wait(long arg0)
final void wait()

Public Constructors

public ELU (Ops tf)

Creates a new ELU with alpha=ALPHA_DEFAULT (1.0).

Parameters
tf the TensorFlow Ops

public ELU (Ops tf, double alpha)

Creates a new ELU with the given alpha.

Parameters
tf the TensorFlow Ops
alpha A scalar, slope of negative section. It controls the value to which an ELU saturates for negative net inputs.
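
As an illustrative sketch, both constructors in use (the custom alpha value here is an assumption):

     ELU<TFloat32> defaultElu = new ELU<>(tf);      // alpha = ALPHA_DEFAULT
     ELU<TFloat32> customElu = new ELU<>(tf, 0.5);  // saturates toward -0.5 for very negative inputs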

Public Methods

public Operand<T> call (Operand<T> input)

Gets the calculation operation for the activation.

Parameters
input the input tensor
Returns
the operand for the activation
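
Because call is generic over the tensor type, the same pattern applies to other floating-point types, for example (a sketch; assumes org.tensorflow.types.TFloat64 and illustrative input values):

     Operand<TFloat64> input64 = tf.constant(new double[] {-1.0, 2.0});
     Operand<TFloat64> result64 = new ELU<TFloat64>(tf).call(input64);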