tfm.optimization.PowerDecayWithOffset

Power learning rate decay with offset.

The learning rate equals pre_offset_learning_rate if step < offset. Otherwise, it equals initial_learning_rate * (step - offset)^power.
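The piecewise rule above can be summarized with a short plain-Python sketch (a hypothetical helper for illustration only; the actual schedule operates on TensorFlow tensors):

def power_decay_with_offset(step, initial_learning_rate, power, offset,
                            pre_offset_learning_rate):
    # Before the offset, hold the learning rate at pre_offset_learning_rate.
    if step < offset:
        return pre_offset_learning_rate
    # After the offset, apply power decay relative to the offset step.
    return initial_learning_rate * (step - offset) ** power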

Args
initial_learning_rate The initial learning rate.
power The order of the polynomial.
offset The offset when computing the power decay.
pre_offset_learning_rate The maximum learning rate we'll use.
name Optional name of the learning rate schedule.
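A minimal usage sketch, assuming the schedule is constructed directly and passed to a Keras optimizer (all parameter values below are illustrative):

import tensorflow as tf
import tensorflow_models as tfm

# Hold the learning rate at 0.1 for the first 10,000 steps, then decay it as
# 0.1 * (step - 10000)**-0.5 (illustrative values).
schedule = tfm.optimization.PowerDecayWithOffset(
    initial_learning_rate=0.1,
    power=-0.5,
    offset=10000,
    pre_offset_learning_rate=0.1)

optimizer = tf.keras.optimizers.Adam(learning_rate=schedule)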

Methods

from_config

Instantiates a LearningRateSchedule from its config.

Args
config Output of get_config().

Returns
A LearningRateSchedule instance.

get_config


Get the configuration of the learning rate schedule.
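As with any LearningRateSchedule, the configuration returned by get_config() can be used to reconstruct an equivalent schedule via from_config(); a brief sketch continuing the example above:

config = schedule.get_config()
restored = tfm.optimization.PowerDecayWithOffset.from_config(config)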

__call__


Calls the schedule as a function, returning the learning rate for the given step.
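Because the schedule is callable, it can be evaluated at a step to inspect the learning rate it would yield (a sketch continuing the example above; step values are illustrative):

# Before the offset: held at pre_offset_learning_rate.
print(schedule(5000))
# After the offset: decays per initial_learning_rate * (step - offset)**power.
print(schedule(20000))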