tf.contrib.feature_column.sequence_input_layer
Builds an input layer for sequence input.
    tf.contrib.feature_column.sequence_input_layer(
        features, feature_columns, weight_collections=None, trainable=True
    )
All `feature_columns` must be sequence dense columns with the same `sequence_length`. The output of this method can be fed into sequence networks, such as an RNN.

The output of this method is a 3D `Tensor` of shape `[batch_size, T, D]`. `T` is the maximum sequence length for this batch, which could differ from batch to batch.
If multiple `feature_columns` are given, with `Di` `num_elements` each, their outputs are concatenated. So, the final `Tensor` has shape `[batch_size, T, D0 + D1 + ... + Dn]`.
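To make the shape arithmetic concrete, here is a minimal NumPy sketch (illustrative only, not the TensorFlow implementation) of concatenating two dense sequence features along the last axis; the batch size, sequence length, and dimensions are hypothetical:

```python
import numpy as np

batch_size, T = 2, 3  # hypothetical batch of 2 examples, max sequence length 3

# One numeric feature (D0 = 1) and one embedded feature (D1 = 10),
# both already padded to the same maximum sequence length T.
rating = np.zeros((batch_size, T, 1))
watches_embedding = np.zeros((batch_size, T, 10))

# sequence_input_layer concatenates per-column outputs on the last axis,
# so the result has shape [batch_size, T, D0 + D1].
input_layer = np.concatenate([rating, watches_embedding], axis=-1)
print(input_layer.shape)  # (2, 3, 11)
```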
Example:

    rating = sequence_numeric_column('rating')
    watches = sequence_categorical_column_with_identity(
        'watches', num_buckets=1000)
    watches_embedding = embedding_column(watches, dimension=10)
    columns = [rating, watches_embedding]

    features = tf.io.parse_example(..., features=make_parse_example_spec(columns))
    input_layer, sequence_length = sequence_input_layer(features, columns)

    rnn_cell = tf.compat.v1.nn.rnn_cell.BasicRNNCell(hidden_size)
    outputs, state = tf.compat.v1.nn.dynamic_rnn(
        rnn_cell, inputs=input_layer, sequence_length=sequence_length)
Args

| Argument | Description |
|---|---|
| `features` | A dict mapping keys to tensors. |
| `feature_columns` | An iterable of dense sequence columns. Valid columns are an `embedding_column` that wraps a `sequence_categorical_column_with_*`, or a `sequence_numeric_column`. |
| `weight_collections` | A list of collection names to which the `Variable` will be added. Note that variables will also be added to the collections `tf.GraphKeys.GLOBAL_VARIABLES` and `ops.GraphKeys.MODEL_VARIABLES`. |
| `trainable` | If `True`, also add the variable to the graph collection `GraphKeys.TRAINABLE_VARIABLES`. |
Returns

An `(input_layer, sequence_length)` tuple where:

- `input_layer`: A float `Tensor` of shape `[batch_size, T, D]`. `T` is the maximum sequence length for this batch, which could differ from batch to batch. `D` is the sum of `num_elements` for all `feature_columns`.
- `sequence_length`: An int `Tensor` of shape `[batch_size]`. The sequence length for each example.
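The `sequence_length` output counts the real (non-padding) steps per example in the padded batch. Here is a hedged NumPy reconstruction of that idea (not the library code), assuming all-zero steps mark padding:

```python
import numpy as np

# Hypothetical padded batch of shape [batch_size, T, D]; all-zero steps are padding.
input_layer = np.array([
    [[0.5], [0.2], [0.0]],   # example 0: 2 real steps, 1 padded
    [[1.0], [0.0], [0.0]],   # example 1: 1 real step, 2 padded
])

# sequence_length[i] = number of non-padding steps in example i.
sequence_length = np.sum(np.any(input_layer != 0.0, axis=-1), axis=1)
print(sequence_length)  # [2 1]
```

Passing this per-example length to `tf.compat.v1.nn.dynamic_rnn`, as in the example above, lets the RNN stop at each example's last real step instead of processing padding.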
Raises

| Exception | Condition |
|---|---|
| `ValueError` | If any of the `feature_columns` is of the wrong type. |
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2020-10-01 UTC.