tf.keras.layers.Bidirectional
Bidirectional wrapper for RNNs.
Inherits From: Wrapper
tf.keras.layers.Bidirectional(
layer, merge_mode='concat', weights=None, backward_layer=None, **kwargs
)
Arguments:

- layer: Recurrent instance.
- merge_mode: Mode by which the outputs of the forward and backward RNNs are combined. One of {'sum', 'mul', 'concat', 'ave', None}. If None, the outputs are not combined; they are returned as a list. (A short sketch of the modes follows this list.)
- backward_layer: Optional Recurrent instance used to handle backward input processing. If backward_layer is not provided, the layer instance passed as the layer argument is used to generate the backward layer automatically. Note that the provided backward_layer should have properties matching those of the layer argument; in particular, it should have the same values for stateful, return_state, return_sequences, etc. In addition, backward_layer and layer should have different go_backwards argument values. A ValueError is raised if these requirements are not met.
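A minimal sketch of how merge_mode affects the output (the input shape and unit counts here are illustrative, not from the original docs):

import numpy as np
import tensorflow as tf

x = np.random.random((2, 5, 10)).astype('float32')  # (batch, timesteps, features)

# 'concat' (the default) joins the forward and backward outputs along the
# feature axis, doubling the output dimension.
concat = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(10), merge_mode='concat')
print(concat(x).shape)  # (2, 20)

# 'sum' adds the two outputs elementwise, keeping the dimension.
summed = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(10), merge_mode='sum')
print(summed(x).shape)  # (2, 10)

# None skips merging and returns the two outputs as a list.
split = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(10), merge_mode=None)
forward_out, backward_out = split(x)
print(forward_out.shape, backward_out.shape)  # (2, 10) (2, 10)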
Call arguments:
The call arguments for this layer are the same as those of the wrapped RNN
layer.
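For instance, keyword arguments such as initial_state and training are forwarded to the wrapped layers. A hedged sketch (shapes and unit counts are illustrative; the half-and-half split of initial_state between the forward and backward layers is the wrapper's convention in this version):

import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(4))
x = np.random.random((2, 5, 10)).astype('float32')

# An LSTM carries two state tensors (h, c), so the wrapper expects four:
# the first half goes to the forward layer, the second half to the
# backward layer.
states = [tf.zeros((2, 4)) for _ in range(4)]
out = layer(x, initial_state=states, training=False)
print(out.shape)  # (2, 8) with the default merge_mode='concat'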
Raises:

- ValueError:
  1. If layer or backward_layer is not a Layer instance.
  2. In case of an invalid merge_mode argument.
  3. If backward_layer has mismatched properties compared to layer (illustrated in the sketch below).
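As a quick illustration of the third case, wrapping a backward layer whose go_backwards flag matches the forward layer's raises ValueError (a sketch; layer sizes are arbitrary):

import tensorflow as tf

forward = tf.keras.layers.LSTM(4, return_sequences=True)
backward = tf.keras.layers.LSTM(4, return_sequences=True)  # go_backwards defaults to False

try:
    tf.keras.layers.Bidirectional(forward, backward_layer=backward)
except ValueError as e:
    print(e)  # forward and backward layers must differ in go_backwards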
Examples:
# Imports needed to run the examples below.
from tensorflow.keras.layers import Activation, Bidirectional, Dense, LSTM
from tensorflow.keras.models import Sequential

model = Sequential()
model.add(Bidirectional(LSTM(10, return_sequences=True), input_shape=(5, 10)))
model.add(Bidirectional(LSTM(10)))
model.add(Dense(5))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

# With a custom backward layer
model = Sequential()
forward_layer = LSTM(10, return_sequences=True)
backward_layer = LSTM(10, activation='relu', return_sequences=True,
                      go_backwards=True)
model.add(Bidirectional(forward_layer, backward_layer=backward_layer,
                        input_shape=(5, 10)))
model.add(Dense(5))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
Methods
reset_states
reset_states()
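For a stateful wrapped RNN, reset_states clears the recorded states of both the forward and backward layers. A minimal sketch (batch and layer sizes are illustrative):

import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(3, stateful=True),
    batch_input_shape=(2, 5, 10))  # stateful RNNs need a fixed batch size
model = tf.keras.Sequential([layer])

x = np.random.random((2, 5, 10)).astype('float32')
model.predict(x)      # states now persist across batches
layer.reset_states()  # zero the states before starting a new sequence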