Built-in optimizer classes.

For more examples, see the base class tf.keras.optimizers.Optimizer.
Modules
experimental module: Public API for tf.keras.optimizers.experimental namespace.
legacy module: Public API for tf.keras.optimizers.legacy namespace.
schedules module: Public API for tf.keras.optimizers.schedules namespace.
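A schedule from the schedules module can be passed to an optimizer in place of a fixed learning rate. A minimal sketch using ExponentialDecay (the initial rate, decay steps, and decay rate below are illustrative values, not defaults):

```python
import tensorflow as tf

# A decaying learning-rate schedule; all numeric values are illustrative.
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1, decay_steps=1000, decay_rate=0.96)

# The schedule object is accepted wherever a float learning rate would be.
opt = tf.keras.optimizers.SGD(learning_rate=schedule)

# Calling the schedule with a step returns the rate at that step;
# at step 0 it equals the initial learning rate.
lr_at_start = float(schedule(0))
```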
Classes
class Adadelta: Optimizer that implements the Adadelta algorithm.
class Adafactor: Optimizer that implements the Adafactor algorithm.
class Adagrad: Optimizer that implements the Adagrad algorithm.
class Adam: Optimizer that implements the Adam algorithm.
class AdamW: Optimizer that implements the AdamW algorithm.
class Adamax: Optimizer that implements the Adamax algorithm.
class Ftrl: Optimizer that implements the FTRL algorithm.
class Nadam: Optimizer that implements the Nadam algorithm.
class Optimizer: Abstract optimizer base class.
class RMSprop: Optimizer that implements the RMSprop algorithm.
class SGD: Gradient descent (with momentum) optimizer.
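All of the classes above share the Optimizer interface; a minimal sketch of one gradient-descent step with SGD (the variable value, gradient, and learning rate are illustrative):

```python
import tensorflow as tf

# One variable and a plain SGD optimizer with an illustrative learning rate.
var = tf.Variable(1.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

# Apply a constant gradient of 1.0:
# new value = 1.0 - 0.1 * 1.0 = 0.9
opt.apply_gradients([(tf.constant(1.0), var)])
```

In training code the gradient would normally come from tf.GradientTape rather than a constant, but the apply_gradients call is the same.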
Functions
deserialize(...): Inverse of the serialize function.
get(...): Retrieves a Keras Optimizer instance.
serialize(...): Serializes the optimizer configuration to a JSON-compatible Python dict.
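These functions round-trip between optimizer instances and plain dicts; a short sketch (the Adam learning rate is an illustrative value):

```python
import tensorflow as tf

# Serialize an optimizer to a JSON-compatible dict, then rebuild
# an equivalent instance from that dict.
opt = tf.keras.optimizers.Adam(learning_rate=0.01)
config = tf.keras.optimizers.serialize(opt)
restored = tf.keras.optimizers.deserialize(config)

# get() also resolves string identifiers to optimizer instances.
sgd = tf.keras.optimizers.get("sgd")
```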