Module: tf.keras.optimizers

Defined in tensorflow/_api/v1/keras/optimizers/

Built-in optimizer classes.


class Adadelta: Adadelta optimizer.

class Adagrad: Adagrad optimizer.

class Adam: Adam optimizer.

class Adamax: Adamax optimizer, from Section 7 of the Adam paper.

class Nadam: Nesterov Adam optimizer.

class Optimizer: Abstract optimizer base class.

class RMSprop: RMSProp optimizer.

class SGD: Stochastic gradient descent optimizer.
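A minimal sketch of how the classes above are used in practice, assuming TensorFlow is installed as `tensorflow` (the keyword argument follows the newer `learning_rate` spelling; older 1.x releases used `lr`):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(1),
])

# An optimizer can be passed to compile() as an instance, giving full
# control over its hyperparameters ...
opt = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9, nesterov=True)
model.compile(optimizer=opt, loss="mse")

# ... or by name, in which case the optimizer's default settings are used.
model.compile(optimizer="adam", loss="mse")
```

Passing an instance is preferred whenever non-default hyperparameters are needed; the string form is a convenience that resolves through `tf.keras.optimizers.get`.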


deserialize(...): Inverse of the serialize function.

get(...): Retrieves a Keras Optimizer instance.

serialize(...): Returns a JSON-serializable configuration for a Keras Optimizer instance.
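A brief sketch of the retrieval and serialization round trip, assuming TensorFlow is installed as `tensorflow`:

```python
import tensorflow as tf

# `get` accepts an identifier (a string name, a config dict, or an
# Optimizer instance) and returns an Optimizer instance.
opt = tf.keras.optimizers.get("rmsprop")

# `serialize` turns the instance into a JSON-compatible dict, and
# `deserialize` is its inverse: it rebuilds an equivalent optimizer.
config = tf.keras.optimizers.serialize(opt)
restored = tf.keras.optimizers.deserialize(config)
```

This round trip is what Keras itself uses when saving and reloading a compiled model's optimizer configuration.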