Module: tfa.optimizers.lazy_adam

Variant of the Adam optimizer that handles sparse updates more efficiently.

Compared with the original Adam optimizer, the variant in this module can provide a large improvement in model training throughput for some applications. The original Adam algorithm maintains two moving-average accumulators for each trainable variable and updates them at every step; LazyAdam updates the accumulators only for the sparse variable indices that appear in the current batch. Because of this, it provides slightly different semantics than the original Adam algorithm and may lead to different empirical results.

Classes

class LazyAdam: Variant of the Adam optimizer that handles sparse updates more efficiently.
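
Since LazyAdam subclasses the Keras Adam optimizer, it can be used as a drop-in replacement wherever Adam is accepted. A minimal sketch follows; the embedding model, dataset shape, and hyperparameters are illustrative assumptions, not part of this module's documentation.

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Illustrative model: an Embedding layer produces sparse gradients,
# which is the case LazyAdam is designed to handle efficiently.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=10000, output_dim=64),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# LazyAdam is used exactly like tf.keras.optimizers.Adam.
model.compile(
    optimizer=tfa.optimizers.LazyAdam(learning_rate=1e-3),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
```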