Distribution-aware versions of Optimizer. These classes keep the familiar Optimizer interface; a minimal usage sketch follows the class list below.
Classes
class AdadeltaOptimizer: Optimizer that implements the Adadelta algorithm.
class AdagradOptimizer: Optimizer that implements the Adagrad algorithm.
class AdamOptimizer: Optimizer that implements the Adam algorithm.
class GradientDescentOptimizer: Optimizer that implements the gradient descent algorithm.
class MomentumOptimizer: Optimizer that implements the Momentum algorithm.
class OptimizerV2: Updated base class for optimizers.
class RMSPropOptimizer: Optimizer that implements the RMSProp algorithm.
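As an illustration, the sketch below trains a toy linear model with GradientDescentOptimizer using the usual construct-then-minimize pattern shared by the classes above. It assumes TensorFlow 1.x in graph mode; the import path tf.contrib.optimizer_v2 is an assumption based on this page, not a confirmed public symbol.

```python
# A minimal sketch, assuming TensorFlow 1.x graph-mode execution.
# The module path tf.contrib.optimizer_v2 is an assumption.
import tensorflow as tf

# Toy linear-regression variables.
w = tf.Variable(0.5, name="w")
b = tf.Variable(0.0, name="b")

x = tf.placeholder(tf.float32, shape=[None])
y = tf.placeholder(tf.float32, shape=[None])

pred = w * x + b
loss = tf.reduce_mean(tf.square(pred - y))

# These optimizers keep the standard Optimizer interface:
# construct with a learning rate, then call minimize() on a loss.
opt = tf.contrib.optimizer_v2.GradientDescentOptimizer(learning_rate=0.1)
train_op = opt.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(train_op, feed_dict={x: [1.0, 2.0, 3.0],
                                      y: [2.0, 4.0, 6.0]})
    print(sess.run([w, b]))  # Approaches w ~= 2.0, b ~= 0.0.
```

Any of the other classes listed above (AdamOptimizer, RMSPropOptimizer, and so on) could be substituted for GradientDescentOptimizer here, with their own constructor hyperparameters.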