public class AdaGrad

Optimizer that implements the Adagrad algorithm.

Adagrad is an optimizer with parameter-specific learning rates, which are adapted relative to how frequently a parameter is updated during training: the more updates a parameter receives, the smaller those updates become.
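The adaptive behavior can be sketched in plain Java for a single scalar parameter (a hypothetical illustration, not the library's implementation; the class and field names below are invented, though the default values match the constants documented on this page):

```java
// Illustrative sketch of the AdaGrad update rule for one scalar parameter.
// Each parameter keeps its own accumulator of squared gradients, so the
// effective step size shrinks as a parameter receives more updates.
public class AdaGradSketch {
    static final float LEARNING_RATE_DEFAULT = 0.001f;
    static final float INITIAL_ACCUMULATOR_DEFAULT = 0.01f;

    float learningRate = LEARNING_RATE_DEFAULT;
    float accumulator = INITIAL_ACCUMULATOR_DEFAULT;

    /** One AdaGrad step: accumulate the squared gradient, then scale the
     *  update by the inverse square root of the accumulator. */
    float step(float variable, float gradient) {
        accumulator += gradient * gradient;
        return variable - learningRate * gradient / (float) Math.sqrt(accumulator);
    }
}
```

Applying the same gradient twice shows the effect: the second step is smaller than the first, because the accumulator has grown.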
Constants

| Type | Constant |
|---|---|
| String | ACCUMULATOR |
| float | INITIAL_ACCUMULATOR_DEFAULT |
| float | LEARNING_RATE_DEFAULT |
Inherited Constants

| Type | Constant |
|---|---|
| String | VARIABLE_V2 |
Public Constructors
Public Methods

| Return type | Method | Description |
|---|---|---|
| String | getOptimizerName() | Gets the name of the optimizer. |
| String | toString() | |
Inherited Methods

| Return type | Method | Description |
|---|---|---|
| Op | applyGradients(List<GradAndVar<? extends TType>> gradsAndVars, String name) | Applies gradients to variables. |
| <T extends TType> List<GradAndVar<?>> | | |
| static String | createName(Output<? extends TType> variable, String slotName) | Creates a name by combining a variable name and a slot name. |
| abstract String | getOptimizerName() | Gets the name of the optimizer. |
| <T extends TType> Optional<Variable<T>> | | |
| final Ops | getTF() | Gets the Optimizer's Ops instance. |
| Op | | |
| Op | | |
| boolean | equals(Object arg0) | |
| final Class<?> | getClass() | |
| int | hashCode() | |
| final void | notify() | |
| final void | notifyAll() | |
| String | toString() | |
| final void | wait(long arg0, int arg1) | |
| final void | wait(long arg0) | |
| final void | wait() | |
Constants

public static final String ACCUMULATOR

Constant Value: "accumulator"

public static final float INITIAL_ACCUMULATOR_DEFAULT

Constant Value: 0.01

public static final float LEARNING_RATE_DEFAULT

Constant Value: 0.001
Public Constructors
public AdaGrad (Graph graph, float learningRate)

Creates an AdaGrad optimizer.

Parameters

| Parameter | Description |
|---|---|
| graph | the TensorFlow Graph |
| learningRate | the learning rate |
public AdaGrad (Graph graph, float learningRate, float initialAccumulatorValue)

Creates an AdaGrad optimizer.

Parameters

| Parameter | Description |
|---|---|
| graph | the TensorFlow Graph |
| learningRate | the learning rate |
| initialAccumulatorValue | the starting value for the accumulators; must be non-negative |

Throws

| Exception | Condition |
|---|---|
| IllegalArgumentException | if initialAccumulatorValue is negative |
public AdaGrad (Graph graph, String name, float learningRate)

Creates an AdaGrad optimizer.

Parameters

| Parameter | Description |
|---|---|
| graph | the TensorFlow Graph |
| name | the name for this Optimizer (defaults to 'Adagrad') |
| learningRate | the learning rate |
public AdaGrad (Graph graph, String name, float learningRate, float initialAccumulatorValue)

Creates an AdaGrad optimizer.

Parameters

| Parameter | Description |
|---|---|
| graph | the TensorFlow Graph |
| name | the name for this Optimizer (defaults to 'Adagrad') |
| learningRate | the learning rate |
| initialAccumulatorValue | the starting value for the accumulators; must be non-negative |

Throws

| Exception | Condition |
|---|---|
| IllegalArgumentException | if initialAccumulatorValue is negative |
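The non-negativity check documented for the constructors above can be sketched like this (assumed logic, not the library source; the class and method names are invented for illustration):

```java
// Hypothetical sketch of the initialAccumulatorValue validation described
// in the constructor documentation: negative starting values are rejected.
public class AccumulatorCheck {
    static float checkInitialAccumulator(float initialAccumulatorValue) {
        if (initialAccumulatorValue < 0f) {
            throw new IllegalArgumentException(
                "initialAccumulatorValue must be non-negative, got "
                    + initialAccumulatorValue);
        }
        return initialAccumulatorValue;
    }
}
```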
Public Methods

public String getOptimizerName ()

Gets the name of the optimizer.

Returns

- the optimizer name

public String toString ()