Module: tfm.optimization.lr_schedule

Learning rate schedule classes.

Classes

class CosineDecayWithOffset: A LearningRateSchedule that uses a cosine decay with optional warmup.
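
A minimal construction sketch, assuming the constructor takes the tf.keras.optimizers.schedules.CosineDecay arguments plus an offset (steps to wait before the decay begins); the values shown are illustrative.

  import tensorflow_models as tfm

  # Cosine decay shifted to start `offset` steps into training (argument names
  # assumed to mirror tf.keras CosineDecay plus `offset`).
  schedule = tfm.optimization.lr_schedule.CosineDecayWithOffset(
      offset=1000,
      initial_learning_rate=0.1,
      decay_steps=10000,
      alpha=0.0)  # fraction of the initial rate kept at the end of the decay

  print(float(schedule(5000)))  # rate 4000 steps into the cosine decay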

class DirectPowerDecay: A learning rate schedule that follows lr * (step)^power.
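
A sketch of the formula in use, assuming initial_learning_rate and power constructor arguments; a negative power gives a decaying rate.

  import tensorflow_models as tfm

  # lr(step) = initial_learning_rate * step**power (argument names assumed).
  schedule = tfm.optimization.lr_schedule.DirectPowerDecay(
      initial_learning_rate=1.0,
      power=-0.5)  # inverse-square-root style decay

  for step in (100, 400, 1600):
      print(step, float(schedule(step)))  # ~0.1, 0.05, 0.025 if the formula holds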

class ExponentialDecayWithOffset: A LearningRateSchedule that uses an exponential decay schedule.
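
A hedged sketch, assuming the constructor mirrors tf.keras.optimizers.schedules.ExponentialDecay with an extra offset argument.

  import tensorflow_models as tfm

  # Exponential decay delayed by `offset` steps (arguments assumed to mirror
  # tf.keras ExponentialDecay plus `offset`).
  schedule = tfm.optimization.lr_schedule.ExponentialDecayWithOffset(
      offset=500,
      initial_learning_rate=0.01,
      decay_steps=1000,   # multiply by decay_rate every 1000 post-offset steps
      decay_rate=0.96,
      staircase=True)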

class LinearWarmup: Linear warmup schedule.
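
A usage sketch, assuming the constructor takes the schedule (or constant rate) to use after warmup, the number of warmup steps, and the starting warmup rate.

  import tensorflow as tf
  import tensorflow_models as tfm

  # Linear ramp from a small warmup rate into a cosine decay (argument names
  # assumed from the class summary; after_warmup_lr_sched may also be a float).
  decay = tf.keras.optimizers.schedules.CosineDecay(
      initial_learning_rate=0.1, decay_steps=10000)
  schedule = tfm.optimization.lr_schedule.LinearWarmup(
      after_warmup_lr_sched=decay,
      warmup_steps=500,
      warmup_learning_rate=1e-4)

  optimizer = tf.keras.optimizers.SGD(learning_rate=schedule)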

class PiecewiseConstantDecayWithOffset: A LearningRateSchedule that uses a piecewise constant decay schedule.
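
A sketch assuming the constructor mirrors tf.keras.optimizers.schedules.PiecewiseConstantDecay with an extra offset argument.

  import tensorflow_models as tfm

  # Piecewise-constant rates shifted by `offset` steps (arguments assumed to
  # mirror tf.keras PiecewiseConstantDecay plus `offset`).
  schedule = tfm.optimization.lr_schedule.PiecewiseConstantDecayWithOffset(
      offset=100,
      boundaries=[10000, 20000],   # step boundaries
      values=[1e-3, 1e-4, 1e-5])   # one more value than boundaries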

class PolynomialDecayWithOffset: A LearningRateSchedule that uses a polynomial decay schedule.
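
A sketch assuming the constructor mirrors tf.keras.optimizers.schedules.PolynomialDecay with an extra offset argument.

  import tensorflow_models as tfm

  # Polynomial decay delayed by `offset` steps (arguments assumed to mirror
  # tf.keras PolynomialDecay plus `offset`).
  schedule = tfm.optimization.lr_schedule.PolynomialDecayWithOffset(
      offset=1000,
      initial_learning_rate=0.1,
      decay_steps=10000,
      end_learning_rate=1e-4,
      power=1.0)  # power=1.0 reduces to a linear decay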

class PolynomialWarmUp: Applies a polynomial warmup schedule on top of a given learning rate decay schedule.
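
A sketch assuming the same after_warmup_lr_sched/warmup_steps arguments as LinearWarmup, plus a power controlling the shape of the warmup curve.

  import tensorflow as tf
  import tensorflow_models as tfm

  # Polynomial ramp into a decay schedule (argument names assumed; power=1.0
  # would behave like a linear warmup).
  decay = tf.keras.optimizers.schedules.PolynomialDecay(
      initial_learning_rate=0.1, decay_steps=10000)
  schedule = tfm.optimization.lr_schedule.PolynomialWarmUp(
      after_warmup_lr_sched=decay,
      warmup_steps=1000,
      power=2.0)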

class PowerAndLinearDecay: A power-decay learning rate schedule multiplied by a linear decay at the end.
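
A sketch assuming constructor arguments for the initial rate, the total number of decay steps, the power, and the fraction of steps over which the final linear decay to zero is applied.

  import tensorflow_models as tfm

  # Power decay for most of training, multiplied by a linear ramp to zero over
  # the final fraction of steps (argument names assumed).
  schedule = tfm.optimization.lr_schedule.PowerAndLinearDecay(
      initial_learning_rate=1.0,
      total_decay_steps=100000,
      power=-0.5,
      linear_decay_fraction=0.1)  # linear ramp over the last 10% of steps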

class PowerDecayWithOffset: Power learning rate decay with offset.
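
A sketch assuming arguments for the initial rate, the power, the offset, and the constant rate used before the offset is reached.

  import tensorflow_models as tfm

  # Constant pre_offset_learning_rate for the first `offset` steps, then
  # initial_learning_rate * (step - offset)**power (argument names assumed).
  schedule = tfm.optimization.lr_schedule.PowerDecayWithOffset(
      initial_learning_rate=1.0,
      power=-0.5,
      offset=10000,
      pre_offset_learning_rate=1e-4)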

class StepCosineDecayWithOffset: Stepwise cosine learning rate decay with offset.
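
A sketch based on the stepwise-cosine description, assuming boundaries/values/offset constructor arguments.

  import tensorflow_models as tfm

  # Each value is assumed to decay along a cosine toward the next value over
  # its interval, with the whole schedule shifted by `offset` steps
  # (argument names assumed).
  schedule = tfm.optimization.lr_schedule.StepCosineDecayWithOffset(
      boundaries=[100000, 110000],
      values=[1.0, 0.5],
      offset=0)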