Module: tfa.optimizers.weight_decay_optimizers


Base class and utilities for making optimizers weight-decay ready.

Classes

class AdamW: Optimizer that implements the Adam algorithm with weight decay.

class DecoupledWeightDecayExtension: Mixin class for extending optimizers with decoupled weight decay.

class SGDW: Optimizer that implements the Momentum (SGD with momentum) algorithm with weight decay.
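
Both AdamW and SGDW can be used as drop-in replacements for their Keras counterparts; the weight decay is decoupled from the gradient update, following Loshchilov & Hutter. Below is a minimal usage sketch; the model and hyperparameter values are illustrative, not prescribed by the module.

```python
import tensorflow as tf
import tensorflow_addons as tfa

# AdamW: Adam update plus decoupled weight decay applied directly to the weights.
optimizer = tfa.optimizers.AdamW(weight_decay=1e-4, learning_rate=1e-3)

# SGDW: momentum SGD with decoupled weight decay (alternative choice).
# optimizer = tfa.optimizers.SGDW(weight_decay=1e-4, learning_rate=1e-2, momentum=0.9)

# Toy model just to show the optimizer plugging into Keras training.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=optimizer, loss="mse")
```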

Functions

extend_with_decoupled_weight_decay(...): Factory function returning an optimizer class with decoupled weight decay.
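
The factory wraps an existing Keras optimizer class and returns a new class whose instances take an additional weight_decay argument. A minimal sketch, assuming a standard Keras optimizer as the base; the name MyAdamW and the hyperparameter values are illustrative:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Build a new optimizer class that adds decoupled weight decay to Adam.
MyAdamW = tfa.optimizers.extend_with_decoupled_weight_decay(tf.keras.optimizers.Adam)

# Instances accept weight_decay in addition to the base optimizer's arguments.
optimizer = MyAdamW(weight_decay=1e-3, learning_rate=1e-3)
```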