Nadam

public class Nadam

Optimizer that implements the Nadam (Nesterov-accelerated Adam) algorithm.

Much like Adam is essentially RMSprop with momentum, Nadam is Adam with Nesterov momentum.
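Concretely, writing m for the first-moment slot and v for the second-moment slot (the FIRST_MOMENT and SECOND_MOMENT constants below), a simplified Nadam update is, omitting the momentum-decay schedule that Keras-style implementations additionally apply to beta one:

```latex
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2,\\
\hat m_t &= \frac{m_t}{1-\beta_1^{\,t}}, \qquad
\hat v_t = \frac{v_t}{1-\beta_2^{\,t}},\\
\theta_{t+1} &= \theta_t - \frac{\eta}{\sqrt{\hat v_t}+\epsilon}
\left( \beta_1 \hat m_t + \frac{(1-\beta_1)\, g_t}{1-\beta_1^{\,t}} \right),
\end{aligned}
```

where g_t is the gradient and eta the learning rate. Plain Adam would use only the bias-corrected first moment inside the parentheses; the extra gradient term is the Nesterov look-ahead.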

Constants

Inherited Constants

org.tensorflow.framework.optimizers.Optimizer
String VARIABLE_V2

Public Constructors

Nadam(Graph graph)
Creates a Nadam Optimizer
Nadam(Graph graph, float learningRate)
Creates a Nadam Optimizer
Nadam(Graph graph, float learningRate, float betaOne, float betaTwo, float epsilon)
Creates a Nadam Optimizer
Nadam(Graph graph, String name, float learningRate)
Creates a Nadam Optimizer
Nadam(Graph graph, String name, float learningRate, float betaOne, float betaTwo, float epsilon)
Creates a Nadam Optimizer

Public Methods

String
getOptimizerName()
Gets the name of the optimizer.

Inherited Methods

org.tensorflow.framework.optimizers.Optimizer
Op
applyGradients(List<GradAndVar<? extends TType>> gradsAndVars, String name)
Applies gradients to variables
<T extends TType> List<GradAndVar<?>>
computeGradients(Operand<?> loss)
Computes the gradients based on a loss operand.
static String
createName(Output<? extends TType> variable, String slotName)
Creates a name by combining a variable name and a slot name
abstract String
getOptimizerName()
Gets the name of the optimizer.
<T extends TType> Optional<Variable<T>>
getSlot(Output<T> var, String slotName)
Gets the slot associated with the specified variable and slot name.
final Ops
getTF()
Gets the Optimizer's Ops instance
Op
minimize(Operand<?> loss)
Minimizes the loss by updating the variables
Op
minimize(Operand<?> loss, String name)
Minimizes the loss by updating the variables
boolean
equals(Object arg0)
final Class<?>
getClass()
int
hashCode()
final void
notify()
final void
notifyAll()
String
toString()
final void
wait(long arg0, int arg1)
final void
wait(long arg0)
final void
wait()

Constants

public static final float BETA_ONE_DEFAULT

Constant Value: 0.9

public static final float BETA_TWO_DEFAULT

Constant Value: 0.999

public static final float EPSILON_DEFAULT

Constant Value: 1.0E-8

public static final String FIRST_MOMENT

Constant Value: "m"

public static final float LEARNING_RATE_DEFAULT

Constant Value: 0.001

public static final String MOMENTUM

Constant Value: "momentum"

public static final String SECOND_MOMENT

Constant Value: "v"

Public Constructors

public Nadam (Graph graph)

Creates a Nadam Optimizer

Parameters
graph the TensorFlow graph

public Nadam (Graph graph, float learningRate)

Creates a Nadam Optimizer

Parameters
graph the TensorFlow graph
learningRate the learning rate, defaults to 0.001

public Nadam (Graph graph, float learningRate, float betaOne, float betaTwo, float epsilon)

Creates a Nadam Optimizer

Parameters
graph the TensorFlow graph
learningRate the learning rate, defaults to 0.001
betaOne The exponential decay rate for the 1st moment estimates. Default is 0.9.
betaTwo The exponential decay rate for the 2nd moment estimates. Default is 0.999.
epsilon A small constant for numerical stability. Default is 1e-8.

public Nadam (Graph graph, String name, float learningRate)

Creates a Nadam Optimizer

Parameters
graph the TensorFlow graph
name the name for this Optimizer, defaults to "Nadam"
learningRate the learning rate, defaults to 0.001

public Nadam (Graph graph, String name, float learningRate, float betaOne, float betaTwo, float epsilon)

Creates a Nadam Optimizer

Parameters
graph the TensorFlow graph
name the name for this Optimizer, defaults to "Nadam"
learningRate the learning rate, defaults to 0.001
betaOne The exponential decay rate for the 1st moment estimates. Default is 0.9.
betaTwo The exponential decay rate for the 2nd moment estimates. Default is 0.999.
epsilon A small constant for numerical stability. Default is 1e-8.
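These parameters map directly onto the Nadam update. As a minimal, self-contained illustration (plain Java, independent of the TensorFlow API, for a single scalar variable, and omitting the momentum-decay schedule the real implementation applies to betaOne), one update step using the documented defaults might look like:

```java
public class NadamStep {
    // Hyperparameters matching the documented defaults.
    static final float LEARNING_RATE = 0.001f;
    static final float BETA_ONE = 0.9f;
    static final float BETA_TWO = 0.999f;
    static final float EPSILON = 1e-8f;

    // State: first moment (slot "m"), second moment (slot "v"), step count.
    float m = 0f, v = 0f;
    int t = 0;

    /** Applies one simplified Nadam update to a scalar variable given its gradient. */
    float step(float variable, float gradient) {
        t++;
        m = BETA_ONE * m + (1 - BETA_ONE) * gradient;             // first moment
        v = BETA_TWO * v + (1 - BETA_TWO) * gradient * gradient;  // second moment
        float mHat = m / (1 - (float) Math.pow(BETA_ONE, t));     // bias correction
        float vHat = v / (1 - (float) Math.pow(BETA_TWO, t));
        // Nesterov look-ahead: blend the corrected moment with the current gradient.
        float mNesterov = BETA_ONE * mHat
                + (1 - BETA_ONE) * gradient / (1 - (float) Math.pow(BETA_ONE, t));
        return variable - LEARNING_RATE * mNesterov / ((float) Math.sqrt(vHat) + EPSILON);
    }

    public static void main(String[] args) {
        NadamStep opt = new NadamStep();
        float w = 1.0f;
        // Minimize f(w) = w^2, whose gradient is 2w; w should move toward 0.
        for (int i = 0; i < 200; i++) {
            w = opt.step(w, 2 * w);
        }
        System.out.println(w);
    }
}
```

With the default learning rate of 0.001, each step moves the variable by roughly the learning rate (the moment ratio is close to 1 for a slowly varying gradient), so 200 steps shrink w noticeably without overshooting.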

Public Methods

public String getOptimizerName ()

Gets the name of the optimizer.

Returns
  • The optimizer name.