Module: tfa.optimizers


Additional optimizers that conform to Keras API.
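
These optimizers are drop-in replacements for the built-in tf.keras optimizers. A minimal sketch (the model architecture, loss, and learning rate below are placeholder choices, not recommendations):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Placeholder model; any Keras model works the same way.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1),
])

# A tfa optimizer is passed to compile() exactly like a built-in one.
model.compile(
    optimizer=tfa.optimizers.LazyAdam(learning_rate=1e-3),
    loss="mse",
)
```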

Modules

average_wrapper module: Base class for optimizer wrappers that average model weights.

conditional_gradient module: Conditional Gradient optimizer.

cyclical_learning_rate module: Cyclical Learning Rate Schedule policies for TensorFlow; see the sketch after this list.

lamb module: Layer-wise Adaptive Moments (LAMB) optimizer.

lazy_adam module: Variant of the Adam optimizer that handles sparse updates more efficiently.

lookahead module: Lookahead optimizer wrapper.

moving_average module: Moving average optimizer.

novograd module: NovoGrad for TensorFlow.

proximal_adagrad module: Proximal Adagrad optimizer.

rectified_adam module: Rectified Adam (RAdam) optimizer.

stochastic_weight_averaging module: An implementation of the Stochastic Weight Averaging optimizer.

weight_decay_optimizers module: Base class to make optimizers weight decay ready.

yogi module: Yogi adaptive optimizer for nonconvex optimization in Keras.
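
The schedules in the cyclical_learning_rate module are tf.keras LearningRateSchedule objects and can be passed to any optimizer's learning_rate argument. A minimal sketch (the bounds and step size are illustrative only):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# The learning rate oscillates between the two bounds, completing one
# half-cycle every `step_size` optimizer steps.
lr_schedule = tfa.optimizers.TriangularCyclicalLearningRate(
    initial_learning_rate=1e-4,
    maximal_learning_rate=1e-2,
    step_size=2000,
)
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule, momentum=0.9)
```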

Classes

class AdamW: Optimizer that implements the Adam algorithm with weight decay.

class AveragedOptimizerWrapper: Base class for optimizers that maintain an average of model weights (used by MovingAverage and SWA).

class ConditionalGradient: Optimizer that implements the Conditional Gradient algorithm.

class CyclicalLearningRate: A LearningRateSchedule that uses a cyclical schedule.

class ExponentialCyclicalLearningRate: A LearningRateSchedule that uses an exponential cyclical schedule.

class LAMB: Optimizer that implements the Layer-wise Adaptive Moments (LAMB) algorithm.

class LazyAdam: Variant of the Adam optimizer that handles sparse updates more efficiently.

class Lookahead: Wraps another optimizer with the Lookahead mechanism; see the sketch after this list.

class MovingAverage: Optimizer that computes a moving average of the variables.

class NovoGrad: Optimizer that implements NovoGrad.

class ProximalAdagrad: Optimizer that implements the Proximal Adagrad algorithm.

class RectifiedAdam: Variant of the Adam optimizer whose adaptive learning rate is rectified so as to have a consistent variance.

class SGDW: Optimizer that implements the Momentum algorithm with decoupled weight decay.

class SWA: This class extends optimizers with Stochastic Weight Averaging (SWA).

class Triangular2CyclicalLearningRate: A LearningRateSchedule that uses a triangular2 cyclical schedule.

class TriangularCyclicalLearningRate: A LearningRateSchedule that uses a triangular cyclical schedule.

class Yogi: Optimizer that implements the Yogi algorithm in Keras.
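
Several of these classes are wrappers that compose with an inner optimizer. A minimal sketch combining Lookahead with RectifiedAdam (the hyperparameter values are illustrative only):

```python
import tensorflow_addons as tfa

# The inner optimizer performs the per-step updates; Lookahead
# periodically interpolates the "slow" weights toward the "fast" ones.
inner = tfa.optimizers.RectifiedAdam(learning_rate=1e-3)
optimizer = tfa.optimizers.Lookahead(inner, sync_period=6, slow_step_size=0.5)
```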

Functions

extend_with_decoupled_weight_decay(...): Factory function returning an optimizer class with decoupled weight decay.
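
A minimal sketch of the factory applied to the built-in Adam optimizer (the decay and learning-rate values are placeholders):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Returns a new optimizer class with decoupled weight decay added;
# the resulting class behaves like tfa.optimizers.AdamW.
AdamWithDecay = tfa.optimizers.extend_with_decoupled_weight_decay(
    tf.keras.optimizers.Adam
)
optimizer = AdamWithDecay(weight_decay=1e-4, learning_rate=1e-3)
```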