
Module: tfa.optimizers.weight_decay_optimizers


Base class and utilities to make optimizers weight-decay ready.

Classes

class AdamW: Optimizer that implements the Adam algorithm with weight decay.

class DecoupledWeightDecayExtension: This class allows extending optimizers with decoupled weight decay.

class SGDW: Optimizer that implements the Momentum algorithm with decoupled weight decay.
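
As a rough usage sketch (hyperparameter values are illustrative, and the module is assumed to be imported as `tfa` from the `tensorflow_addons` package), these classes can be used as drop-in Keras optimizers:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Adam with decoupled weight decay; values here are illustrative only.
adamw = tfa.optimizers.AdamW(learning_rate=1e-3, weight_decay=1e-4)

# SGD with momentum and decoupled weight decay.
sgdw = tfa.optimizers.SGDW(learning_rate=1e-2, momentum=0.9, weight_decay=1e-4)

# Either optimizer plugs into a standard Keras workflow.
model = tf.keras.Sequential([tf.keras.layers.Dense(10)])
model.compile(optimizer=adamw, loss="mse")
```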

Functions

extend_with_decoupled_weight_decay(...): Factory function returning an optimizer class with decoupled weight decay.
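
A minimal sketch of the factory function, assuming `tensorflow_addons` is imported as `tfa`; the base optimizer and hyperparameter values are only examples:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Create an Adagrad variant that applies decoupled weight decay.
AdagradW = tfa.optimizers.extend_with_decoupled_weight_decay(
    tf.keras.optimizers.Adagrad)

# The extended class accepts weight_decay in addition to the
# arguments of the wrapped optimizer (values here are illustrative).
optimizer = AdagradW(weight_decay=1e-4, learning_rate=0.01)
```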