tf.keras.optimizers.schedules.LearningRateSchedule
The learning rate schedule base class.
You can use a learning rate schedule to modulate how the learning rate
of your optimizer changes over time.
Several built-in learning rate schedules are available, such as
keras.optimizers.schedules.ExponentialDecay or
keras.optimizers.schedules.PiecewiseConstantDecay:
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-2,
    decay_steps=10000,
    decay_rate=0.9)
optimizer = keras.optimizers.SGD(learning_rate=lr_schedule)
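For comparison, here is a minimal sketch of the other built-in schedule mentioned above, keras.optimizers.schedules.PiecewiseConstantDecay; the boundary and value choices are illustrative, not defaults:

boundaries = [10000]    # switch the learning rate after 10,000 training steps
values = [1e-2, 1e-3]   # one more value than boundaries
lr_schedule = keras.optimizers.schedules.PiecewiseConstantDecay(
    boundaries=boundaries, values=values)
optimizer = keras.optimizers.SGD(learning_rate=lr_schedule)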
A LearningRateSchedule
instance can be passed in as the learning_rate
argument of any optimizer.
To implement your own schedule object, you should implement the __call__
method, which takes a step
argument (scalar integer tensor, the
current training step count).
As with any other Keras object, you can also optionally
make your object serializable by implementing the get_config
and from_config methods.
Example:
class MyLRSchedule(keras.optimizers.schedules.LearningRateSchedule):

    def __init__(self, initial_learning_rate):
        self.initial_learning_rate = initial_learning_rate

    def __call__(self, step):
        return self.initial_learning_rate / (step + 1)

optimizer = keras.optimizers.SGD(learning_rate=MyLRSchedule(0.1))
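The schedule above is not serializable as written. Below is a minimal sketch of how it could expose its configuration by overriding get_config; the class name is illustrative, and the base class's from_config, which rebuilds the object by passing the config dict to the constructor, usually does not need to be overridden:

class MySerializableLRSchedule(keras.optimizers.schedules.LearningRateSchedule):

    def __init__(self, initial_learning_rate):
        self.initial_learning_rate = initial_learning_rate

    def __call__(self, step):
        return self.initial_learning_rate / (step + 1)

    def get_config(self):
        # Return the constructor arguments so the schedule can be rebuilt.
        return {"initial_learning_rate": self.initial_learning_rate}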
Methods
from_config
@classmethod
from_config(
    config
)
Instantiates a LearningRateSchedule
from its config.
Args:
    config: Output of get_config().

Returns:
    A LearningRateSchedule instance.
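As an illustration of the round trip, using one of the built-in schedules shown earlier:

schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-2,
    decay_steps=10000,
    decay_rate=0.9)
config = schedule.get_config()   # plain dict of constructor arguments
restored = keras.optimizers.schedules.ExponentialDecay.from_config(config)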
get_config
get_config()
__call__
__call__(
    step
)
Call self as a function.
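For example, the custom schedule defined earlier can be evaluated directly at a given training step:

lr_schedule = MyLRSchedule(0.1)
current_lr = lr_schedule(9)   # 0.1 / (9 + 1) = 0.01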