Module: tf.keras.optimizers

This file was autogenerated; do not edit it by hand, since modifications would be overwritten.

Modules

legacy module: Legacy optimizer implementations, kept for backwards compatibility.

schedules module: Learning rate schedule classes that can be passed to an optimizer in place of a fixed learning rate (see the sketch below).
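
A minimal sketch of pairing a schedule from the schedules module with an optimizer; the decay values here are illustrative, not recommendations:

    import tensorflow as tf

    # Decay the learning rate from 0.1 by a factor of 0.96 every 10,000 steps.
    lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
        initial_learning_rate=0.1,
        decay_steps=10_000,
        decay_rate=0.96,
    )

    # Any optimizer on this page accepts a schedule in place of a float
    # learning rate; the schedule is evaluated at the current step.
    optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)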

Classes

class Adadelta: Optimizer that implements the Adadelta algorithm.

class Adafactor: Optimizer that implements the Adafactor algorithm.

class Adagrad: Optimizer that implements the Adagrad algorithm.

class Adam: Optimizer that implements the Adam algorithm (usage sketch after this list).

class AdamW: Optimizer that implements the AdamW algorithm.

class Adamax: Optimizer that implements the Adamax algorithm.

class Ftrl: Optimizer that implements the FTRL algorithm.

class Lion: Optimizer that implements the Lion algorithm.

class LossScaleOptimizer: An optimizer that dynamically scales the loss to prevent underflow in mixed-precision training (sketch after this list).

class Nadam: Optimizer that implements the Nadam algorithm.

class Optimizer: Base optimizer class that implements TensorFlow-specific optimizer logic.

class RMSprop: Optimizer that implements the RMSprop algorithm.

class SGD: Gradient descent (with momentum) optimizer.
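
The classes above share a common construction and training interface. A minimal sketch of the two usual entry points, compile()/fit() and a custom training step; the model and data are toy placeholders:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

    # Entry point 1: hand the optimizer to compile().
    optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
    model.compile(optimizer=optimizer, loss=loss_fn, metrics=["accuracy"])

    # Entry point 2: drive a custom training step directly.
    x_batch = tf.random.normal([32, 20])                          # toy inputs
    y_batch = tf.random.uniform([32], maxval=10, dtype=tf.int32)  # toy labels
    with tf.GradientTape() as tape:
        logits = model(x_batch)
        loss = loss_fn(y_batch, logits)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))

Swapping Adam for SGD, RMSprop, or any other class above changes only the constructor call.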
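
LossScaleOptimizer is the exception: it wraps another optimizer rather than standing alone. A minimal mixed-precision sketch, assuming the wrapper handles loss scaling and gradient unscaling itself when driven through compile()/fit():

    import tensorflow as tf

    # Run compute in float16 while keeping variables in float32.
    tf.keras.mixed_precision.set_global_policy("mixed_float16")

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu"),
        # Keep the output layer in float32 so the logits stay precise.
        tf.keras.layers.Dense(10, dtype="float32"),
    ])

    # The wrapper multiplies the loss by a dynamic scale factor so that
    # small float16 gradients do not underflow to zero, then unscales
    # the gradients before the inner optimizer applies them.
    inner = tf.keras.optimizers.SGD(learning_rate=0.01)
    optimizer = tf.keras.optimizers.LossScaleOptimizer(inner)

    model.compile(
        optimizer=optimizer,
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

Note that compile() may apply this wrapping automatically under a mixed_float16 policy; the explicit form is shown for clarity.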

Functions

deserialize(...): Reconstructs a Keras optimizer instance from its configuration dict.

get(...): Retrieves a Keras optimizer instance from a string identifier, a config dict, or an existing instance (example after this list).

serialize(...): Returns the optimizer configuration as a Python dict.
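
Together these three functions round-trip an optimizer through a plain-Python configuration. A minimal sketch:

    import tensorflow as tf

    # get() accepts a string identifier, a config dict, or an instance.
    opt = tf.keras.optimizers.get("adam")

    # serialize() captures the class name and hyperparameters as a dict,
    # e.g. for storing alongside saved weights.
    config = tf.keras.optimizers.serialize(opt)

    # deserialize() rebuilds an equivalent optimizer from that dict.
    restored = tf.keras.optimizers.deserialize(config)
    print(type(restored).__name__)  # Adam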