tf.keras.optimizers.Adagrad

Optimizer that implements the Adagrad algorithm.

Inherits From: Optimizer

Adagrad is an optimizer with parameter-specific learning rates, which are adapted relative to how frequently a parameter gets updated during training. The more updates a parameter receives, the smaller the updates.
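To make the per-parameter adaptation concrete, here is a minimal standalone sketch of the Adagrad update rule. The variable names (param, accum, grad) are illustrative, not the Keras-internal names, and the exact placement of epsilon varies across implementations; it is shown inside the square root here:

import tensorflow as tf

lr = 0.001                       # learning_rate
epsilon = 1e-7                   # small constant for numerical stability
param = tf.Variable([1.0, 2.0])
# Each accumulator is seeded with initial_accumulator_value (0.1 by default).
accum = tf.Variable([0.1, 0.1])

grad = tf.constant([0.5, -0.3])          # gradient for this step
accum.assign_add(tf.square(grad))        # running sum of squared gradients
# A larger accumulated sum yields a smaller effective step for that parameter.
param.assign_sub(lr * grad / tf.sqrt(accum + epsilon))

Because the accumulator only grows, frequently updated parameters see their effective learning rate shrink monotonically over training.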

Args

learning_rate: A Tensor, floating point value, or a schedule that is a tf.keras.optimizers.schedules.LearningRateSchedule, or a callable that takes no arguments and returns the actual value to use. The learning rate. Defaults to 0.001.
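As a hedged usage sketch, either form of learning_rate can be passed when constructing the optimizer; the model below is a placeholder just to show the wiring:

import tensorflow as tf

# learning_rate accepts a plain float or a LearningRateSchedule.
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1, decay_steps=1000, decay_rate=0.9)
optimizer = tf.keras.optimizers.Adagrad(learning_rate=schedule)

# Placeholder single-layer model for illustration only.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer=optimizer, loss="mse")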