Module: tf.contrib.optimizer_v2

Distribution-aware versions of the standard tf.train optimizers, built on the updated OptimizerV2 base class.

Classes

class AdadeltaOptimizer: Optimizer that implements the Adadelta algorithm.

class AdagradOptimizer: Optimizer that implements the Adagrad algorithm.

class AdamOptimizer: Optimizer that implements the Adam algorithm.

class GradientDescentOptimizer: Optimizer that implements the gradient descent algorithm.

class MomentumOptimizer: Optimizer that implements the Momentum algorithm.

class OptimizerV2: Updated base class for optimizers.

class RMSPropOptimizer: Optimizer that implements the RMSProp algorithm.
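These classes are intended as drop-in replacements for their tf.train counterparts. As a minimal sketch, assuming a TF 1.x runtime (tf.contrib is not available in TF 2.0), the distribution-aware AdamOptimizer can be used like any other optimizer:

import tensorflow as tf  # TF 1.x; provides tf.contrib and graph-mode sessions

# Toy quadratic loss with its minimum at w == 2.0.
w = tf.Variable(5.0)
loss = tf.square(w - 2.0)

# Distribution-aware Adam; accepts the same hyperparameters as
# tf.train.AdamOptimizer.
opt = tf.contrib.optimizer_v2.AdamOptimizer(learning_rate=0.1)
train_op = opt.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(train_op)
    print(sess.run(w))  # approaches 2.0

The distribution-aware behavior matters when the model is built under a distribution strategy scope; outside of one, these optimizers behave like their tf.train counterparts.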