AdaGrad

public class AdaGrad

Optimizer that implements the Adagrad algorithm.

Adagrad is an optimizer with parameter-specific learning rates, which are adapted relative to how frequently a parameter gets updated during training. The more updates a parameter receives, the smaller the updates.
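The per-parameter update can be sketched in plain Java. This is an illustration of the Adagrad algorithm itself, not the library's internal implementation; the `epsilon` stabilizer is an assumption for numerical safety and the helper name is hypothetical:

```java
public class AdagradSketch {

    // Applies Adagrad updates to a single scalar parameter. Each parameter
    // keeps a running accumulator of squared gradients; the effective step
    // size shrinks as the accumulator grows.
    static float update(float param, float accumulator, float learningRate,
                        float epsilon, float[] gradients) {
        for (float grad : gradients) {
            accumulator += grad * grad;  // accumulate g^2
            param -= learningRate * grad / (float) Math.sqrt(accumulator + epsilon);
        }
        return param;
    }

    public static void main(String[] args) {
        // Defaults mirror LEARNING_RATE_DEFAULT (0.001) and
        // INITIAL_ACCUMULATOR_DEFAULT (0.01) from this class.
        float result = update(1.0f, 0.01f, 0.001f, 1e-7f,
                              new float[] {0.5f, 0.5f, 0.5f});
        System.out.println(result); // each successive step is smaller than the last
    }
}
```

With a constant gradient, the denominator grows every step, so the same gradient produces progressively smaller parameter updates, which is the adaptive behavior described above.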

Constants

String ACCUMULATOR
float INITIAL_ACCUMULATOR_DEFAULT
float LEARNING_RATE_DEFAULT

Inherited Constants

Public Constructors

AdaGrad(Graph graph)
Creates an AdaGrad Optimizer
AdaGrad(Graph graph, float learningRate)
Creates an AdaGrad Optimizer
AdaGrad(Graph graph, float learningRate, float initialAccumulatorValue)
Creates an AdaGrad Optimizer
AdaGrad(Graph graph, String name, float learningRate)
Creates an AdaGrad Optimizer
AdaGrad(Graph graph, String name, float learningRate, float initialAccumulatorValue)
Creates an AdaGrad Optimizer

Public Methods

String
getOptimizerName()
Get the name of the optimizer.
String
toString()

Inherited Methods

Constants

public static final String ACCUMULATOR

Constant Value: "accumulator"

public static final float INITIAL_ACCUMULATOR_DEFAULT

Constant Value: 0.01

public static final float LEARNING_RATE_DEFAULT

Constant Value: 0.001

Public Constructors

public AdaGrad (Graph graph)

Creates an AdaGrad Optimizer

Parameters
graph the TensorFlow Graph

public AdaGrad (Graph graph, float learningRate)

Creates an AdaGrad Optimizer

Parameters
graph the TensorFlow Graph
learningRate the learning rate

public AdaGrad (Graph graph, float learningRate, float initialAccumulatorValue)

Creates an AdaGrad Optimizer

Parameters
graph the TensorFlow Graph
learningRate the learning rate
initialAccumulatorValue Starting value for the accumulators, must be non-negative.
Throws
IllegalArgumentException if initialAccumulatorValue is negative

public AdaGrad (Graph graph, String name, float learningRate)

Creates an AdaGrad Optimizer

Parameters
graph the TensorFlow Graph
name the name for this Optimizer (defaults to 'Adagrad')
learningRate the learning rate

public AdaGrad (Graph graph, String name, float learningRate, float initialAccumulatorValue)

Creates an AdaGrad Optimizer

Parameters
graph the TensorFlow Graph
name the name for this Optimizer (defaults to 'Adagrad')
learningRate the learning rate
initialAccumulatorValue Starting value for the accumulators, must be non-negative.
Throws
IllegalArgumentException if initialAccumulatorValue is negative
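The non-negativity constraint on `initialAccumulatorValue` can be sketched as follows. This is a hypothetical validation helper illustrating the documented behavior, not the library's source:

```java
public class AccumulatorCheck {

    // Hypothetical check mirroring the documented constraint:
    // the starting accumulator value must be non-negative.
    static float checkInitialAccumulator(float initialAccumulatorValue) {
        if (initialAccumulatorValue < 0f) {
            throw new IllegalArgumentException(
                "initialAccumulatorValue must be non-negative, got: "
                    + initialAccumulatorValue);
        }
        return initialAccumulatorValue;
    }

    public static void main(String[] args) {
        System.out.println(checkInitialAccumulator(0.01f)); // accepted
        try {
            checkInitialAccumulator(-1.0f);                 // rejected
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```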

Public Methods

public String getOptimizerName ()

Get the name of the optimizer.

Returns
  • The optimizer name.

public String toString ()

Returns
  • A string representation of this optimizer.