tf.keras.experimental.SequenceFeatures
A layer for sequence input.
Inherits From: `Layer`, `Module`
    tf.keras.experimental.SequenceFeatures(
        feature_columns, trainable=True, name=None, **kwargs
    )
All `feature_columns` must be sequence dense columns with the same `sequence_length`. The output of this method can be fed into sequence networks, such as an RNN.

The output of this method is a 3D `Tensor` of shape `[batch_size, T, D]`. `T` is the maximum sequence length for this batch, which could differ from batch to batch.

If multiple `feature_columns` are given with `Di` `num_elements` each, their outputs are concatenated. So, the final `Tensor` has shape `[batch_size, T, D0 + D1 + ... + Dn]`.
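To make the shape arithmetic concrete, here is a small numpy sketch; the arrays are placeholders standing in for real column outputs, not anything the layer itself produces. With a numeric column of `D0 = 1` and an embedding column of `D1 = 10`, concatenating along the last axis yields `[batch_size, T, 11]`:

```python
import numpy as np

batch_size, T = 2, 5
# Placeholder per-column outputs (stand-ins for what each column would emit):
rating_out = np.zeros((batch_size, T, 1))    # D0 = 1, a numeric column
watches_out = np.zeros((batch_size, T, 10))  # D1 = 10, an embedding column

# The layer concatenates column outputs along the feature axis:
combined = np.concatenate([rating_out, watches_out], axis=-1)
print(combined.shape)  # (2, 5, 11), i.e. [batch_size, T, D0 + D1]
```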
Example:

    import tensorflow as tf

    # Behavior of some cells or feature columns may depend on whether we are in
    # training or inference mode, e.g. applying dropout.
    training = True
    rating = tf.feature_column.sequence_numeric_column('rating')
    watches = tf.feature_column.sequence_categorical_column_with_identity(
        'watches', num_buckets=1000)
    watches_embedding = tf.feature_column.embedding_column(watches,
                                                           dimension=10)
    columns = [rating, watches_embedding]

    features = {
        'rating': tf.sparse.from_dense([[1.0, 1.1, 0, 0, 0],
                                        [2.0, 2.1, 2.2, 2.3, 2.5]]),
        'watches': tf.sparse.from_dense([[2, 85, 0, 0, 0], [33, 78, 2, 73, 1]])
    }

    sequence_input_layer = tf.keras.experimental.SequenceFeatures(columns)
    sequence_input, sequence_length = sequence_input_layer(
        features, training=training)
    sequence_length_mask = tf.sequence_mask(sequence_length)
    hidden_size = 32
    rnn_cell = tf.keras.layers.SimpleRNNCell(hidden_size)
    # return_state=True so the call yields both the output and the final state.
    rnn_layer = tf.keras.layers.RNN(rnn_cell, return_state=True)
    outputs, state = rnn_layer(sequence_input, mask=sequence_length_mask)
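The example masks padded steps with `tf.sequence_mask`. As a rough numpy sketch of what that mask looks like (the helper below is illustrative, not the TF implementation), position `t` of row `i` is valid exactly when `t < sequence_length[i]`:

```python
import numpy as np

# Illustrative numpy stand-in for tf.sequence_mask (not the TF implementation).
def sequence_mask(lengths, maxlen=None):
    lengths = np.asarray(lengths)
    if maxlen is None:
        maxlen = int(lengths.max())
    # mask[i, t] is True exactly when t < lengths[i]
    return np.arange(maxlen)[None, :] < lengths[:, None]

# The example's 'rating' rows carry 2 and 5 real steps (the trailing zeros
# become implicit once converted to sparse), so the mask keeps only those:
mask = sequence_mask([2, 5])
print(mask.astype(int))
# [[1 1 0 0 0]
#  [1 1 1 1 1]]
```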
| Args | |
|---|---|
| `feature_columns` | An iterable of dense sequence columns. Valid columns are: an `embedding_column` that wraps a `sequence_categorical_column_with_*`, and `sequence_numeric_column`. |
| `trainable` | Boolean, whether the layer's variables will be updated via gradient descent during training. |
| `name` | Name to give to the SequenceFeatures. |
| `**kwargs` | Keyword arguments to construct a layer. |
| Raises | |
|---|---|
| `ValueError` | If any of the `feature_columns` is not a `SequenceDenseColumn`. |
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2023-10-06 UTC.