tf.keras.optimizers.schedules.PiecewiseConstantDecay
A LearningRateSchedule that uses a piecewise constant decay schedule.
Inherits From: LearningRateSchedule
tf.keras.optimizers.schedules.PiecewiseConstantDecay(
boundaries, values, name=None
)
The schedule is a 1-arg callable that computes the piecewise constant
learning rate when passed the current optimizer step. This can be useful for
changing the learning rate across different invocations of optimizer
functions.
Example: use a learning rate that's 1.0 for the first 100001 steps, 0.5
for the next 10000 steps, and 0.1 for any additional steps.
import tensorflow as tf

step = tf.Variable(0, trainable=False)
boundaries = [100000, 110000]
values = [1.0, 0.5, 0.1]
learning_rate_fn = tf.keras.optimizers.schedules.PiecewiseConstantDecay(
    boundaries, values)

# Later, whenever we perform an optimization step, we pass in the step.
learning_rate = learning_rate_fn(step)
You can pass this schedule directly into a tf.keras.optimizers.Optimizer
as the learning rate. The learning rate schedule is also serializable and
deserializable using tf.keras.optimizers.schedules.serialize and
tf.keras.optimizers.schedules.deserialize.
Returns
A 1-arg callable learning rate schedule that takes the current optimizer
step and outputs the decayed learning rate, a scalar Tensor of the same
type as the boundary tensors.
The output of the 1-arg callable is values[0] when step <= boundaries[0],
values[1] when step > boundaries[0] and step <= boundaries[1], ...,
and values[-1] when step > boundaries[-1].
Args
boundaries: A list of Tensors, ints, or floats with strictly increasing
entries, and with all elements having the same type as the optimizer step.
values: A list of Tensors, floats, or ints that specifies the values for
the intervals defined by boundaries. It should have one more element than
boundaries, and all elements should have the same type.
name: A string. Optional name for the operation. Defaults to
'PiecewiseConstant'.
Raises
ValueError: If the number of elements in boundaries and values does not
match; values must have exactly one more element than boundaries.
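For example, passing two boundaries with only two values (one short of the required three) triggers this error; a minimal sketch:

```python
import tensorflow as tf

try:
    # Two boundaries require three values; this list is one short.
    tf.keras.optimizers.schedules.PiecewiseConstantDecay(
        boundaries=[100000, 110000], values=[1.0, 0.5])
    raised = False
except ValueError:
    raised = True
```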
Methods
from_config
@classmethod
from_config(
    config
)
Instantiates a LearningRateSchedule from its config.
Args
config: Output of get_config().

Returns
A LearningRateSchedule instance.
get_config
get_config()
Returns the config of the schedule: a dict of its constructor arguments
(boundaries, values, and name).
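A round-trip sketch of get_config and from_config, assuming only that the config is a plain dict of the constructor arguments:

```python
import tensorflow as tf

schedule = tf.keras.optimizers.schedules.PiecewiseConstantDecay(
    boundaries=[100000, 110000], values=[1.0, 0.5, 0.1])

# get_config() yields a serializable dict of constructor arguments...
config = schedule.get_config()
# ...and from_config() rebuilds an equivalent schedule from it.
clone = tf.keras.optimizers.schedules.PiecewiseConstantDecay.from_config(config)
```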
__call__
__call__(
    step
)
Call self as a function.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2023-10-06 UTC.