Given a file pattern (or list of files), sets up a queue of file names,
reads Example protos using the provided reader, and uses a batch queue to
create batches of examples of size batch_size.
All queue runners are added to the queue runners collection, and may be
started via start_queue_runners.
All ops are added to the default graph.
Use parse_fn if you need to parse or process individual examples.
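As a minimal usage sketch, assuming TF 1.x graph execution, TFRecord input files, and a hypothetical /tmp/data/train-*.tfrecord pattern (note that this contrib function is deprecated in favor of tf.data):

    import tensorflow as tf

    keys, examples = tf.contrib.learn.read_keyed_batch_examples(
        "/tmp/data/train-*.tfrecord",   # hypothetical file pattern
        batch_size=32,
        reader=tf.TFRecordReader,       # reader class; instances expose a read method
        randomize_input=True,
        num_epochs=1,                   # creates a local variable (see Args below)
        queue_capacity=10000,
        num_threads=1)

    with tf.Session() as sess:
        # Required because num_epochs is set.
        sess.run(tf.local_variables_initializer())
        coord = tf.train.Coordinator()
        threads = tf.train.start_queue_runners(sess=sess, coord=coord)
        try:
            while not coord.should_stop():
                key_batch, example_batch = sess.run([keys, examples])
                # ... consume the batched serialized Example protos ...
        except tf.errors.OutOfRangeError:
            pass  # all num_epochs passes over the data have been read
        finally:
            coord.request_stop()
            coord.join(threads)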
Args
file_pattern
List of files or patterns of file paths containing
Example records. See tf.io.gfile.glob for pattern rules.
batch_size
An int or scalar Tensor specifying the batch size to use.
reader
A function or class that returns an object with a
read method of the form (filename tensor) -> (example tensor).
randomize_input
Whether the input should be randomized.
num_epochs
Integer specifying the number of times to read through the
dataset. If None, cycles through the dataset forever.
Note: if specified, this creates a local variable that must be initialized,
so call tf.compat.v1.local_variables_initializer() and run the returned op in a session.
queue_capacity
Capacity for input queue.
num_threads
The number of threads enqueuing examples. For a
predictable and repeatable order of reading and enqueuing, such as in
prediction and evaluation mode, num_threads should be 1.
read_batch_size
An int or scalar Tensor specifying the number of
records to read at once.
parse_fn
Parsing function that takes an Example Tensor and returns a parsed
representation. If None, no parsing is done. See the sketch after this list.
name
Name of resulting op.
seed
An integer (optional). Seed used if randomize_input is True.
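As a sketch of how parse_fn and num_epochs fit together, assuming hypothetical "image" and "label" features and that the per-example parsed representation is what gets batched; queue runners and session setup are as in the sketch above:

    def parse_fn(serialized_example):
        # Hypothetical per-example parser: decodes one serialized Example proto.
        return tf.parse_single_example(
            serialized_example,
            features={
                "image": tf.FixedLenFeature([], tf.string),
                "label": tf.FixedLenFeature([], tf.int64),
            })

    keys, parsed = tf.contrib.learn.read_keyed_batch_examples(
        "/tmp/data/train-*.tfrecord",   # hypothetical file pattern
        batch_size=32,
        reader=tf.TFRecordReader,
        num_epochs=5,                   # adds a local epoch-counter variable
        parse_fn=parse_fn)

    # Because num_epochs is set, run tf.local_variables_initializer() in the
    # session before starting the queue runners.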
[null,null,["Last updated 2020-10-01 UTC."],[],[],null,["# tf.contrib.learn.read_keyed_batch_examples\n\n\u003cbr /\u003e\n\n|-----------------------------------------------------------------------------------------------------------------------------------------------------|\n| [View source on GitHub](https://github.com/tensorflow/tensorflow/blob/v1.15.0/tensorflow/contrib/learn/python/learn/learn_io/graph_io.py#L116-L183) |\n\nAdds operations to read, queue, batch `Example` protos. (deprecated) \n\n tf.contrib.learn.read_keyed_batch_examples(\n file_pattern, batch_size, reader, randomize_input=True, num_epochs=None,\n queue_capacity=10000, num_threads=1, read_batch_size=1, parse_fn=None,\n name=None, seed=None\n )\n\n| **Warning:** THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Use tf.data.\n\nGiven file pattern (or list of files), will setup a queue for file names,\nread `Example` proto using provided `reader`, use batch queue to create\nbatches of examples of size `batch_size`.\n\nAll queue runners are added to the queue runners collection, and may be\nstarted via `start_queue_runners`.\n\nAll ops are added to the default graph.\n\nUse `parse_fn` if you need to do parsing / processing on single examples.\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n| Args ---- ||\n|-------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| `file_pattern` | List of files or patterns of file paths containing `Example` records. See [`tf.io.gfile.glob`](../../../tf/io/gfile/glob) for pattern rules. |\n| `batch_size` | An int or scalar `Tensor` specifying the batch size to use. |\n| `reader` | A function or class that returns an object with `read` method, (filename tensor) -\\\u003e (example tensor). |\n| `randomize_input` | Whether the input should be randomized. |\n| `num_epochs` | Integer specifying the number of times to read through the dataset. If `None`, cycles through the dataset forever. NOTE - If specified, creates a variable that must be initialized, so call [`tf.compat.v1.local_variables_initializer()`](../../../tf/initializers/local_variables) and run the op in a session. |\n| `queue_capacity` | Capacity for input queue. |\n| `num_threads` | The number of threads enqueuing examples. In order to have predictable and repeatable order of reading and enqueueing, such as in prediction and evaluation mode, `num_threads` should be 1. |\n| `read_batch_size` | An int or scalar `Tensor` specifying the number of records to read at once. |\n| `parse_fn` | Parsing function, takes `Example` Tensor returns parsed representation. If `None`, no parsing is done. |\n| `name` | Name of resulting op. |\n| `seed` | An integer (optional). Seed used if randomize_input == True. |\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n| Returns ------- ||\n|---|---|\n| Returns tuple of: \u003cbr /\u003e - `Tensor` of string keys. - String `Tensor` of batched `Example` proto. ||\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n| Raises ------ ||\n|--------------|---------------------|\n| `ValueError` | for invalid inputs. |\n\n\u003cbr /\u003e"]]