Bidirectional wrapper for RNNs.
Inherits From: Wrapper, Layer, Module
```python
tf.keras.layers.Bidirectional(
    layer,
    merge_mode='concat',
    weights=None,
    backward_layer=None,
    **kwargs
)
```
| Args | |
|---|---|
| `layer` | `keras.layers.RNN` instance, such as `keras.layers.LSTM` or `keras.layers.GRU`. It could also be a `keras.layers.Layer` instance that meets the following criteria: it is a sequence-processing layer (accepts 3D+ inputs); it has `go_backwards`, `return_sequences`, and `return_state` attributes (with the same semantics as for the `RNN` class); it has an `input_spec` attribute; and it implements serialization via `get_config()` and `from_config()`. Note that the recommended way to create new RNN layers is to write a custom RNN cell and use it with `keras.layers.RNN`, instead of subclassing `keras.layers.Layer` directly. When `return_sequences` is `True`, the output of the masked timestep will be zero regardless of the layer's original `zero_output_for_mask` value. |
| `merge_mode` | Mode by which outputs of the forward and backward RNNs will be combined. One of `{'sum', 'mul', 'concat', 'ave', None}`. If `None`, the outputs will not be combined; they will be returned as a list. Defaults to `'concat'` (see the sketch after this table). |
| `backward_layer` | Optional `keras.layers.RNN` or `keras.layers.Layer` instance to be used to handle backwards input processing. If `backward_layer` is not provided, the layer instance passed as the `layer` argument will be used to generate the backward layer automatically. Note that the provided `backward_layer` should have properties matching those of the `layer` argument; in particular, it should have the same values for `stateful`, `return_state`, `return_sequences`, etc. In addition, `backward_layer` and `layer` should have different `go_backwards` argument values. A `ValueError` will be raised if these requirements are not met. |
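For instance, here is a minimal sketch (assuming TensorFlow 2.x) of how `merge_mode` changes what the wrapper returns: with the default `'concat'`, the forward and backward outputs are joined along the last axis, while `None` returns them as a list.

```python
import tensorflow as tf

inputs = tf.random.normal((2, 5, 10))  # (batch, timesteps, features)

# Default merge_mode='concat': forward and backward outputs are
# concatenated along the last axis, giving 4 + 4 = 8 features.
concat = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(4))
print(concat(inputs).shape)  # (2, 8)

# merge_mode=None: the outputs are returned as a two-element list.
unmerged = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(4), merge_mode=None)
forward_out, backward_out = unmerged(inputs)
print(forward_out.shape, backward_out.shape)  # (2, 4) (2, 4)
```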
| Call arguments |
|---|
| The call arguments for this layer are the same as those of the wrapped RNN layer. Beware that when passing the `initial_state` argument during the call of this layer, the first half of the elements in the `initial_state` list will be passed to the forward RNN call and the last half will be passed to the backward RNN call (see the sketch after this table). |
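Here is a minimal sketch (assuming TensorFlow 2.x) of this split. An LSTM carries two state tensors per direction (`h` and `c`), so the wrapped layer expects four in total.

```python
import tensorflow as tf

units = 4
layer = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(units))

inputs = tf.random.normal((2, 5, 10))
# Four state tensors: the first two (h, c) go to the forward RNN,
# the last two to the backward RNN.
states = [tf.zeros((2, units)) for _ in range(4)]
output = layer(inputs, initial_state=states)
print(output.shape)  # (2, 8) with the default merge_mode='concat'
```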
| Raises | |
|---|---|
| `ValueError` | If `layer` or `backward_layer` is not a `Layer` instance; in case of an invalid `merge_mode` argument; or if `backward_layer` has mismatched properties compared to `layer` (see the sketch after this table). |
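As an illustration, here is a minimal sketch (assuming TensorFlow 2.x) of the third condition: a `backward_layer` whose `go_backwards` value matches the forward layer's is rejected at construction time.

```python
import tensorflow as tf

# Both layers default to go_backwards=False, which violates the
# requirement that the two values differ, so a ValueError is raised.
try:
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(4, return_sequences=True),
        backward_layer=tf.keras.layers.LSTM(4, return_sequences=True))
except ValueError as err:
    print(err)
```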
Examples:
```python
from tensorflow.keras.layers import Activation, Bidirectional, Dense, LSTM
from tensorflow.keras.models import Sequential

model = Sequential()
model.add(Bidirectional(LSTM(10, return_sequences=True), input_shape=(5, 10)))
model.add(Bidirectional(LSTM(10)))
model.add(Dense(5))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
```

```python
# With a custom backward layer
model = Sequential()
forward_layer = LSTM(10, return_sequences=True)
backward_layer = LSTM(10, activation='relu', return_sequences=True,
                      go_backwards=True)
model.add(Bidirectional(forward_layer, backward_layer=backward_layer,
                        input_shape=(5, 10)))
model.add(Dense(5))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
```
Methods
reset_states
```python
reset_states()
```
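This resets the recorded states of the wrapped stateful RNN, for both the forward and backward layers. A minimal sketch, assuming TensorFlow 2.x and a fixed batch size:

```python
import tensorflow as tf

# Stateful RNNs carry their final states over to the next batch, so
# the batch size must stay fixed across calls.
layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(4, stateful=True))

inputs = tf.random.normal((2, 5, 10))
layer(inputs)         # states now carry over to the next call
layer.reset_states()  # both directions return to their initial state
```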