tf.keras.layers.TimeDistributed
This wrapper allows you to apply a layer to every temporal slice of an input.
Inherits From: Wrapper
tf.keras.layers.TimeDistributed(
layer, **kwargs
)
The input should be at least 3D, and the dimension at index one
will be considered the temporal dimension.
Consider a batch of 32 video samples, where each sample is a 128x128 RGB image
with channels_last data format, across 10 timesteps.
The batch input shape is (32, 10, 128, 128, 3).

You can then use TimeDistributed to apply a Conv2D layer to each of the
10 timesteps, independently:
import tensorflow as tf

inputs = tf.keras.Input(shape=(10, 128, 128, 3))
conv_2d_layer = tf.keras.layers.Conv2D(64, (3, 3))
outputs = tf.keras.layers.TimeDistributed(conv_2d_layer)(inputs)
outputs.shape
TensorShape([None, 10, 126, 126, 64])
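Because the same layer instance is reused at every timestep, its weights are shared across timesteps. A common related pattern is wrapping a Dense layer so that the per-timestep outputs of a recurrent layer are projected independently. The following is a minimal sketch; the layer sizes and the choice of an LSTM are illustrative, not part of the API described above:

import tensorflow as tf

# Sequence input: 10 timesteps, each a 16-dimensional feature vector.
inputs = tf.keras.Input(shape=(10, 16))
# An LSTM that returns an output at every timestep: shape (None, 10, 32).
x = tf.keras.layers.LSTM(32, return_sequences=True)(inputs)
# The same Dense layer (and the same weights) is applied to each of the
# 10 timesteps, giving shape (None, 10, 8).
outputs = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(8))(x)
outputs.shape
TensorShape([None, 10, 8])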
Arguments:
layer: a tf.keras.layers.Layer instance.

Call arguments:
inputs: Input tensor.
training: Python boolean indicating whether the layer should behave in
training mode or in inference mode. This argument is passed to the
wrapped layer (only if the layer supports this argument).
mask: Binary tensor of shape (samples, timesteps) indicating whether
a given timestep should be masked. This argument is passed to the
wrapped layer (only if the layer supports this argument).
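To illustrate the training argument, a minimal sketch is to wrap a layer whose behaviour differs between training and inference, such as Dropout; the shapes and dropout rate below are arbitrary:

import tensorflow as tf

# Dropout behaves differently in training and inference, so it shows that
# TimeDistributed forwards the `training` flag to the wrapped layer.
td_dropout = tf.keras.layers.TimeDistributed(tf.keras.layers.Dropout(0.5))

x = tf.ones((4, 10, 8))                  # (samples, timesteps, features)
y_train = td_dropout(x, training=True)   # dropout is applied
y_infer = td_dropout(x, training=False)  # layer acts as the identity

A mask is typically produced upstream (for example by an Embedding layer with mask_zero=True) and, as noted above, is forwarded to the wrapped layer only when that layer supports a mask argument.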
Raises:
ValueError: If not initialized with a tf.keras.layers.Layer instance.