# tf.data.Options

Represents options for `tf.data.Dataset`.

A `tf.data.Options` object can be used, for instance, to control which static
optimizations to apply to the input pipeline graph, or whether to use
performance modeling to dynamically tune the parallelism of operations such as
`tf.data.Dataset.map` or `tf.data.Dataset.interleave`.

The options apply to the entire dataset and are carried over to datasets
created through tf.data transformations.

Options are set by constructing an `Options` object and passing it to the
`tf.data.Dataset.with_options(options)` transformation, which returns a
dataset with the options set.
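For example, mirroring this page's own usage example, options can be set on a dataset and read back:

```python
import tensorflow as tf

# Build a small dataset and disable deterministic output ordering.
dataset = tf.data.Dataset.range(42)
options = tf.data.Options()
options.deterministic = False
dataset = dataset.with_options(options)

print(dataset.options().deterministic)  # False
```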
Attributes
----------

| Attribute | Description |
|---|---|
| `autotune` | The autotuning options associated with the dataset. See `tf.data.experimental.AutotuneOptions` for more details. |
| `deterministic` | Whether the outputs need to be produced in deterministic order. If None, defaults to True. |
| `experimental_deterministic` | DEPRECATED. Use `deterministic` instead. |
| `experimental_distribute` | The distribution strategy options associated with the dataset. See `tf.data.experimental.DistributeOptions` for more details. |
| `experimental_external_state_policy` | Overrides the default policy for how to handle external state when serializing a dataset or checkpointing its iterator. Three settings are available: `IGNORE` (external state is silently ignored), `WARN` (external state is ignored and a warning is logged), and `FAIL` (external state results in an error). |
| `experimental_optimization` | The optimization options associated with the dataset. See `tf.data.experimental.OptimizationOptions` for more details. |
| `experimental_slack` | Whether to introduce 'slack' in the last `prefetch` of the input pipeline, if it exists. This may reduce CPU contention with accelerator host-side activity at the start of a step. The slack frequency is determined by the number of devices attached to this input pipeline. If None, defaults to False. |
| `experimental_symbolic_checkpoint` | Whether to checkpoint internal input pipeline state that maintains cursors into data sources, identifying the last element(s) produced as output to the tf.data consumer. This is an alternative to the default 'explicit' checkpointing, which stores the internal input pipeline state in the checkpoint. Note that symbolic checkpointing is not supported for transformations that can reorder elements. |
| `experimental_threading` | DEPRECATED. Use `threading` instead. |
| `threading` | The threading options associated with the dataset. See `tf.data.ThreadingOptions` for more details. |
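As a sketch, several of these attributes can be set on one `Options` object before attaching it to a dataset; the attribute names are as documented above, and the values chosen here are purely illustrative:

```python
import tensorflow as tf

options = tf.data.Options()

# Ignore external state (e.g. random seeds) when checkpointing the iterator,
# logging a warning instead of raising an error.
options.experimental_external_state_policy = (
    tf.data.experimental.ExternalStatePolicy.WARN
)

# Introduce slack in the final prefetch to reduce CPU contention with
# accelerator host-side activity at the start of a step.
options.experimental_slack = True

# Run this dataset on a private threadpool of 4 threads.
options.threading.private_threadpool_size = 4

dataset = tf.data.Dataset.range(10).with_options(options)
```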
The `merge` method merges this object with another `tf.data.Options` and
returns the result as a new `tf.data.Options` object. If this object and the
`options` to merge set an option differently, a warning is generated and this
object's value is updated with the `options` object's value.
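A minimal sketch of merging two non-conflicting `Options` objects, where the merged result carries the options set on either side:

```python
import tensorflow as tf

options1 = tf.data.Options()
options1.deterministic = False

options2 = tf.data.Options()
options2.experimental_slack = True

# merge() returns a new Options object; neither input is required afterwards.
merged = options1.merge(options2)
print(merged.deterministic)        # False
print(merged.experimental_slack)   # True
```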
**Note:** A known limitation of the `tf.data.Options` implementation is that
the options are not preserved across tf.function boundaries. In particular, to
set options for a dataset that is iterated within a tf.function, the options
need to be set within the same tf.function.

Methods
-------

### `merge`

    merge(
        options
    )

Merges itself with the given `tf.data.Options`.

**Args:** `options`: The `tf.data.Options` to merge with.

**Returns:** A new `tf.data.Options` object which is the result of merging
self with the input `tf.data.Options`.

### `__eq__`

    __eq__(
        other
    )

Return self==value.

### `__ne__`

    __ne__(
        other
    )

Return self!=value.

Last updated 2023-10-06 UTC.