AdaGrad

public class AdaGrad

Optimizer that implements the Adagrad algorithm.

Adagrad is an optimizer with parameter-specific learning rates, which are adapted relative to how frequently a parameter gets updated during training. The more updates a parameter receives, the smaller the updates.
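
The adaptation described above can be sketched in plain Java (an illustrative sketch of the Adagrad update rule, not the library's internal implementation). The default values mirror `LEARNING_RATE_DEFAULT` and `INITIAL_ACCUMULATOR_DEFAULT` below; the epsilon term is an assumed numerical-stability constant.

```java
// Illustrative sketch of the Adagrad update rule: each parameter keeps a
// running sum of squared gradients, and its effective step size shrinks
// as that sum grows.
public class AdaGradSketch {
    public static void main(String[] args) {
        float learningRate = 0.001f;   // LEARNING_RATE_DEFAULT
        float accumulator = 0.01f;     // INITIAL_ACCUMULATOR_DEFAULT
        float epsilon = 1e-7f;         // assumed stability constant
        float variable = 1.0f;

        // Apply the same gradient repeatedly; every step is smaller than
        // the last because the accumulator only ever grows.
        float previousStep = Float.MAX_VALUE;
        for (int i = 0; i < 5; i++) {
            float gradient = 0.5f;
            accumulator += gradient * gradient;
            float step = learningRate * gradient
                    / ((float) Math.sqrt(accumulator) + epsilon);
            variable -= step;
            if (step >= previousStep) {
                throw new AssertionError("updates should shrink over time");
            }
            previousStep = step;
        }
        System.out.println("final variable = " + variable);
    }
}
```
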

Constants

Inherited Constants

org.tensorflow.framework.optimizers.Optimizer
String VARIABLE_V2

Public Constructors

AdaGrad(Graph graph)
Creates an AdaGrad optimizer using the default learning rate and initial accumulator value.
AdaGrad(Graph graph, float learningRate)
Creates an AdaGrad optimizer using the default initial accumulator value.
AdaGrad(Graph graph, float learningRate, float initialAccumulatorValue)
Creates an AdaGrad optimizer.
AdaGrad(Graph graph, String name, float learningRate)
Creates a named AdaGrad optimizer using the default initial accumulator value.
AdaGrad(Graph graph, String name, float learningRate, float initialAccumulatorValue)
Creates a named AdaGrad optimizer.

Public Methods

String
getOptimizerName()
Get the name of the optimizer.
String
toString()

Inherited Methods

org.tensorflow.framework.optimizers.Optimizer
Op
applyGradients(List<GradAndVar<? extends TType>> gradsAndVars, String name)
Applies gradients to variables.
<T extends TType> List<GradAndVar<?>>
computeGradients(Operand<?> loss)
Computes the gradients based on a loss operand.
static String
createName(Output<? extends TType> variable, String slotName)
Creates a name by combining a variable name and a slot name.
abstract String
getOptimizerName()
Get the name of the optimizer.
<T extends TType> Optional<Variable<T>>
getSlot(Output<T> var, String slotName)
Gets the slot associated with the specified variable and slot name.
final Ops
getTF()
Gets the Optimizer's Ops instance.
Op
minimize(Operand<?> loss)
Minimizes the loss by updating the variables.
Op
minimize(Operand<?> loss, String name)
Minimizes the loss by updating the variables.
boolean
equals(Object arg0)
final Class<?>
getClass()
int
hashCode()
final void
notify()
final void
notifyAll()
String
toString()
final void
wait(long arg0, int arg1)
final void
wait(long arg0)
final void
wait()

Constants

public static final String ACCUMULATOR

Constant Value: "accumulator"

public static final float INITIAL_ACCUMULATOR_DEFAULT

Constant Value: 0.01

public static final float LEARNING_RATE_DEFAULT

Constant Value: 0.001

Public Constructors

public AdaGrad (Graph graph)

Creates an AdaGrad optimizer using the default learning rate (0.001) and initial accumulator value (0.01).

Parameters
graph the TensorFlow Graph

public AdaGrad (Graph graph, float learningRate)

Creates an AdaGrad optimizer using the default initial accumulator value (0.01).

Parameters
graph the TensorFlow Graph
learningRate the learning rate

public AdaGrad (Graph graph, float learningRate, float initialAccumulatorValue)

Creates an AdaGrad optimizer.

Parameters
graph the TensorFlow Graph
learningRate the learning rate
initialAccumulatorValue the starting value for the accumulators; must be non-negative
Throws
IllegalArgumentException if initialAccumulatorValue is negative

public AdaGrad (Graph graph, String name, float learningRate)

Creates a named AdaGrad optimizer using the default initial accumulator value (0.01).

Parameters
graph the TensorFlow Graph
name the name for this Optimizer (defaults to 'Adagrad')
learningRate the learning rate

public AdaGrad (Graph graph, String name, float learningRate, float initialAccumulatorValue)

Creates a named AdaGrad optimizer.

Parameters
graph the TensorFlow Graph
name the name for this Optimizer (defaults to 'Adagrad')
learningRate the learning rate
initialAccumulatorValue the starting value for the accumulators; must be non-negative
Throws
IllegalArgumentException if initialAccumulatorValue is negative
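
The `IllegalArgumentException` behavior documented for these constructors can be sketched as a standalone check (a hypothetical helper mirroring the documented contract, not the library source): a negative `initialAccumulatorValue` is rejected up front.

```java
// Hypothetical standalone version of the argument check the AdaGrad
// constructors describe: the accumulator's starting value must be
// non-negative, otherwise IllegalArgumentException is thrown.
public class AccumulatorCheck {
    static float checkInitialAccumulatorValue(float value) {
        if (value < 0f) {
            throw new IllegalArgumentException(
                    "initialAccumulatorValue must be non-negative, got " + value);
        }
        return value;
    }

    public static void main(String[] args) {
        // The default value (0.01) passes the check.
        System.out.println(checkInitialAccumulatorValue(0.01f));
        try {
            checkInitialAccumulatorValue(-1f);
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```
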

Public Methods

public String getOptimizerName ()

Get the name of the optimizer.

Returns
  • The optimizer name.

public String toString ()