Module: tf.keras.optimizers

Aliases:

  • Module tf.compat.v1.keras.optimizers

Modules

schedules module: Public API for the tf.keras.optimizers.schedules namespace (learning rate schedules).
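
A schedule object from this module can be passed wherever a fixed learning rate is accepted. A minimal sketch, assuming the ExponentialDecay schedule from tf.keras.optimizers.schedules:

import tensorflow as tf

# Decay the learning rate from 0.1 by a factor of 0.96 every 10,000 steps.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=10000,
    decay_rate=0.96)

# The schedule is passed in place of a scalar learning rate.
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)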

Classes

class Adadelta: Optimizer that implements the Adadelta algorithm.

class Adagrad: Optimizer that implements the Adagrad algorithm.

class Adam: Optimizer that implements the Adam algorithm.

class Adamax: Optimizer that implements the Adamax algorithm.

class Ftrl: Optimizer that implements the FTRL algorithm.

class Nadam: Optimizer that implements the NAdam algorithm.

class Optimizer: Updated base class for optimizers.

class RMSprop: Optimizer that implements the RMSprop algorithm.

class SGD: Stochastic gradient descent and momentum optimizer.
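
Each of these classes can be handed to model.compile or driven manually with apply_gradients. A hedged sketch of both patterns (the model and variable below are placeholders, not part of this API page):

import tensorflow as tf

# Configure an optimizer via its constructor arguments.
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)

# Typical usage: pass the instance to compile().
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(10, activation='softmax', input_shape=(784,))])
model.compile(optimizer=optimizer,
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Manual usage: compute gradients with a tape and apply them directly.
var = tf.Variable(1.0)
with tf.GradientTape() as tape:
    loss = var ** 2
grads = tape.gradient(loss, [var])
optimizer.apply_gradients(zip(grads, [var]))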

Functions

deserialize(...): Inverse of the serialize function.

get(...): Retrieves a Keras Optimizer instance.

serialize(...): Serializes an Optimizer instance into a configuration dict (inverse of deserialize).
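
A short sketch of how these helpers fit together: get resolves a string identifier to an Optimizer instance, and serialize/deserialize round-trip an optimizer through a plain configuration dict:

import tensorflow as tf

# get() accepts a string identifier (or a config dict) and returns an instance.
optimizer = tf.keras.optimizers.get('adam')

# serialize() produces a JSON-friendly configuration dict...
config = tf.keras.optimizers.serialize(optimizer)

# ...and deserialize() reconstructs an equivalent optimizer from it.
restored = tf.keras.optimizers.deserialize(config)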