tf.keras.layers.Bidirectional
Bidirectional wrapper for RNNs.
Inherits From: Wrapper
tf.keras.layers.Bidirectional(
    layer, merge_mode='concat', weights=None, backward_layer=None, **kwargs
)
Arguments |
layer
|
keras.layers.RNN instance, such as keras.layers.LSTM or
keras.layers.GRU. It could also be a keras.layers.Layer instance
that meets the following criteria:
- Be a sequence-processing layer (accepts 3D+ inputs).
- Have go_backwards, return_sequences and return_state
attributes (with the same semantics as for the RNN class).
- Have an input_spec attribute.
- Implement serialization via get_config() and from_config().
Note that the recommended way to create new RNN layers is to write a
custom RNN cell and use it with keras.layers.RNN , instead of
subclassing keras.layers.Layer directly.
|
merge_mode
|
Mode by which outputs of the forward and backward RNNs will be
combined. One of {'sum', 'mul', 'concat', 'ave', None}. If None, the
outputs will not be combined, they will be returned as a list. Default
value is 'concat'.
|
backward_layer
|
Optional keras.layers.RNN or keras.layers.Layer instance
to be used to handle backwards input processing. If backward_layer is
not provided, the layer instance passed as the layer argument will be
used to generate the backward layer automatically.
Note that the provided backward_layer should have properties
matching those of the layer argument; in particular, it should have the
same values for stateful, return_sequences, return_state, etc.
In addition, backward_layer and layer should have different go_backwards argument values.
A ValueError will be raised if these requirements are not met.
|
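As a sketch of how merge_mode affects the output, the following assumes TF 2.x eager execution; the batch size and feature sizes are arbitrary:

```python
import numpy as np
import tensorflow as tf

x = np.random.random((2, 5, 10)).astype("float32")  # (batch, timesteps, features)

# 'concat' (the default) stacks the forward and backward outputs along the
# last axis, so 10 forward units + 10 backward units -> 20 features.
concat = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(10))
print(concat(x).shape)  # (2, 20)

# With merge_mode=None the two directions are returned as a list instead.
unmerged = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(10), merge_mode=None)
fwd, bwd = unmerged(x)
print(fwd.shape, bwd.shape)  # (2, 10) (2, 10)
```

'sum', 'mul' and 'ave' combine the two directions element-wise, so with those modes the output keeps the per-direction shape (2, 10).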
Call arguments:
The call arguments for this layer are the same as those of the wrapped RNN
layer.
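Because the call signature is inherited from the wrapped layer, arguments such as initial_state can be passed straight through. A hedged sketch, assuming TF 2.x, where the forward states are listed before the backward states:

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(5, 10))
# An LSTM carries two state tensors (h, c), so the bidirectional wrapper
# expects four initial-state tensors: forward h, c, then backward h, c.
init_states = [tf.keras.Input(shape=(10,)) for _ in range(4)]
outputs = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(10))(
    inputs, initial_state=init_states)
model = tf.keras.Model([inputs] + init_states, outputs)
print(model.output_shape)  # (None, 20)
```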
Raises |
ValueError
|
- If layer or backward_layer is not a Layer instance.
- In case of an invalid merge_mode argument.
- If backward_layer has mismatched properties compared to layer.
|
Examples:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Activation, Bidirectional, Dense, LSTM

model = Sequential()
model.add(Bidirectional(LSTM(10, return_sequences=True), input_shape=(5, 10)))
model.add(Bidirectional(LSTM(10)))
model.add(Dense(5))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
# With custom backward layer
model = Sequential()
forward_layer = LSTM(10, return_sequences=True)
backward_layer = LSTM(10, activation='relu', return_sequences=True,
go_backwards=True)
model.add(Bidirectional(forward_layer, backward_layer=backward_layer,
input_shape=(5, 10)))
model.add(Dense(5))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
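Per the note under the layer argument, new RNN layers are best built from a custom cell wrapped in keras.layers.RNN; such a wrapped layer meets the criteria above and can itself be made bidirectional. A minimal sketch, with the built-in LSTMCell standing in for a custom cell and arbitrary sizes:

```python
import numpy as np
import tensorflow as tf

# A cell wrapped in keras.layers.RNN exposes go_backwards, return_sequences,
# return_state, input_spec and get_config(), so Bidirectional accepts it.
rnn = tf.keras.layers.RNN(tf.keras.layers.LSTMCell(8), return_sequences=True)
bidi = tf.keras.layers.Bidirectional(rnn)

x = np.zeros((1, 4, 3), dtype="float32")
print(bidi(x).shape)  # (1, 4, 16): 8 forward units concatenated with 8 backward
```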
Methods
reset_states
View source
reset_states()
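reset_states clears the recurrent state of both the forward and backward layers, which matters when the wrapped layer is stateful. A hedged sketch, assuming TF 2.x eager execution (stateful=True requires a fixed batch size, supplied here via build()):

```python
import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(4, stateful=True))
layer.build((1, 3, 2))  # fixed batch size of 1

x = np.ones((1, 3, 2), dtype="float32")
y1 = layer(x)          # starts from the zero state
layer(x)               # second call starts from the carried-over state
layer.reset_states()   # zero the states of both directions again
y3 = layer(x)          # matches the first call

np.testing.assert_allclose(y1.numpy(), y3.numpy(), atol=1e-6)
```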
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2020-10-01 UTC.