Classes

| Class | Description |
|---|---|
| AdaDelta | Optimizer that implements the Adadelta algorithm. |
| AdaGrad | Optimizer that implements the Adagrad algorithm. |
| AdaGradDA | Optimizer that implements the Adagrad Dual-Averaging algorithm. |
| Adam | Optimizer that implements the Adam algorithm. |
| Adamax | Optimizer that implements the Adamax algorithm. |
| Ftrl | Optimizer that implements the FTRL algorithm. |
| GradientDescent | Basic stochastic gradient descent optimizer. |
| Momentum | Stochastic gradient descent with momentum, either Nesterov or classical. |
| Nadam | Optimizer that implements the NAdam algorithm. |
| Optimizer | Base class for gradient optimizers. |
| Optimizer.GradAndVar&lt;T extends TType&gt; | A class that holds a paired gradient and variable. |
| Optimizer.Options | Optional attributes for Optimizer. |
| RMSProp | Optimizer that implements the RMSProp algorithm. |
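The Momentum entry above names two update rules. Below is a plain-Java sketch of one common formulation of the classical and Nesterov momentum steps; it is illustrative only, not the TF Java `Momentum` class, and all names in it are invented for this sketch.

```java
// Plain-Java sketch of momentum SGD updates (illustrative; not the
// TensorFlow Java Momentum API). One common formulation:
//   classical: v <- mu*v - lr*g;  w <- w + v
//   Nesterov:  v <- mu*v - lr*g;  w <- w + mu*v - lr*g
public class MomentumSketch {

    /** One classical momentum step; returns {newWeight, newVelocity}. */
    static double[] classicalStep(double w, double v, double lr, double mu, double g) {
        double vNew = mu * v - lr * g;      // accumulate velocity
        return new double[] { w + vNew, vNew };
    }

    /** One Nesterov momentum step; returns {newWeight, newVelocity}. */
    static double[] nesterovStep(double w, double v, double lr, double mu, double g) {
        double vNew = mu * v - lr * g;      // same velocity update
        // Nesterov applies the velocity "looked ahead" by one momentum factor.
        return new double[] { w + mu * vNew - lr * g, vNew };
    }

    public static void main(String[] args) {
        double w = 0.0, v = 0.0;
        // Minimize f(w) = w^2 - 2w, whose gradient is 2w - 2.
        for (int i = 0; i < 3; i++) {
            double g = 2 * w - 2;
            double[] r = classicalStep(w, v, 0.1, 0.9, g);
            w = r[0];
            v = r[1];
        }
        System.out.println("w after 3 classical steps: " + w);
    }
}
```

The classical variant applies the accumulated velocity directly; the Nesterov variant applies it one momentum factor ahead, which often damps oscillation along steep directions.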
Enums

| Enum | Description |
|---|---|
| Optimizers | Enumeration used to create a new Optimizer with default parameters. |
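The Optimizers enum describes a factory keyed by optimizer kind, each constant carrying default hyperparameters. The following is a hypothetical plain-Java sketch of that pattern; the `Opt` interface, the enum constants, their default learning rates, and the `create` method are all invented for illustration and are not the TF Java API.

```java
// Hypothetical sketch of an enum-based optimizer factory with default
// hyperparameters (illustrative; not the TF Java Optimizers enum).
interface Opt {
    double step(double w, double g);    // one parameter update
}

public class OptimizersSketch {

    public enum Kind {
        GRADIENT_DESCENT(0.01),         // invented default learning rates
        ADAGRAD(0.1);

        final double defaultLr;

        Kind(double lr) { this.defaultLr = lr; }

        /** Create an optimizer of this kind with its default parameters. */
        public Opt create() {
            switch (this) {
                case GRADIENT_DESCENT:
                    return (w, g) -> w - defaultLr * g;
                case ADAGRAD:
                    return new Opt() {
                        double accum = 0.0;   // per-parameter squared-gradient sum
                        public double step(double w, double g) {
                            accum += g * g;
                            return w - defaultLr * g / Math.sqrt(accum + 1e-7);
                        }
                    };
            }
            throw new AssertionError("unhandled kind: " + this);
        }
    }

    public static void main(String[] args) {
        Opt sgd = Kind.GRADIENT_DESCENT.create();
        System.out.println("one SGD step from w=1, g=2: " + sgd.step(1.0, 2.0));
    }
}
```

Keeping the defaults on the enum constants lets callers obtain a working optimizer from a single name, while the base-class hierarchy in the table above still carries the per-algorithm state and options.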