nsl.lib.decay_over_time
Returns a decayed value of init_value over time.
nsl.lib.decay_over_time(
    global_step, decay_config, init_value=1.0
)
When training a model with a regularizer, the objective function can be formulated as follows:
\[\text{objective} = \lambda_1 \cdot \text{loss} + \lambda_2 \cdot \text{regularization}\]
This function can be used in three cases:

- Incrementally diminishing the importance of the loss term, by applying a decay function to \(\lambda_1\) over time. We'll denote this by writing \(\lambda_1\) = decay_over_time(init_value).
- Incrementally increasing the importance of the regularization term, by setting \(\lambda_2\) = init_value - decay_over_time(init_value).
- Combining the above two cases, namely setting \(\lambda_1\) = decay_over_time(init_value) and \(\lambda_2\) = init_value - decay_over_time(init_value); a sketch of this combined case follows the list.
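Below is a minimal usage sketch for the combined case. The nsl.lib.decay_over_time signature is taken from this page; the DecayConfig constructor arguments (decay_steps, decay_rate, decay_type) and the DecayType.EXPONENTIAL_DECAY enum value are assumptions about typical fields, so verify them against nsl.configs.DecayConfig in your installed version.

import tensorflow as tf
import neural_structured_learning as nsl

# Assumed DecayConfig fields; check nsl.configs.DecayConfig for the exact
# constructor arguments and defaults in your version.
decay_config = nsl.configs.DecayConfig(
    decay_steps=1000,
    decay_rate=0.96,
    decay_type=nsl.configs.DecayType.EXPONENTIAL_DECAY)

global_step = tf.constant(500, dtype=tf.int64)
init_value = 1.0

# Combined case: decay lambda_1 while ramping up lambda_2.
lambda_1 = nsl.lib.decay_over_time(global_step, decay_config, init_value)
lambda_2 = init_value - lambda_1

# The weights then plug into the objective from above:
#   objective = lambda_1 * loss + lambda_2 * regularization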
This function requires a global_step value to compute the decayed value.
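As a rough illustration of that dependence (assuming TF2 eager execution and the same hypothetical DecayConfig fields as above), evaluating one configuration at increasing steps yields progressively smaller values:

import tensorflow as tf
import neural_structured_learning as nsl

# Assumed DecayConfig fields; verify against nsl.configs.DecayConfig.
decay_config = nsl.configs.DecayConfig(decay_steps=1000, decay_rate=0.5)

# Larger global_step values produce smaller decayed values; with an
# exponential-style decay this roughly halves every 1000 steps here.
for step in (1, 1000, 5000):
    decayed = nsl.lib.decay_over_time(
        tf.constant(step, dtype=tf.int64), decay_config, init_value=1.0)
    print(step, float(decayed))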
Args:
  global_step: A scalar int32 or int64 Tensor or a Python number. Must be positive.
  decay_config: A nsl.configs.DecayConfig for computing the decay value.
  init_value: A scalar Tensor to set the initial value to be decayed.
Returns:
  A scalar float Tensor.
View source on GitHub: https://github.com/tensorflow/neural-structured-learning/blob/v1.4.0/neural_structured_learning/lib/utils.py#L416-L453