tf.keras.layers.LSTMCell

Cell class for the LSTM layer.

Inherits From: LSTMCell, Layer, Module

See the Keras RNN API guide for details about the usage of the RNN API.

This class processes one step within the whole time sequence input, whereas tf.keras.layers.LSTM processes the whole sequence.

For example:

import tensorflow as tf
inputs = tf.random.normal([32, 10, 8])
rnn = tf.keras.layers.RNN(tf.keras.layers.LSTMCell(4))
output = rnn(inputs)
print(output.shape)
(32, 4)
rnn = tf.keras.layers.RNN(
   tf.keras.layers.LSTMCell(4),
   return_sequences=True,
   return_state=True)
whole_seq_output, final_memory_state, final_carry_state = rnn(inputs)
print(whole_seq_output.shape)
(32, 10, 4)
print(final_memory_state.shape)
(32, 4)
print(final_carry_state.shape)
(32, 4)

Args

units Positive integer, dimensionality of the output space.
activation Activation function to use. Default: hyperbolic tangent (tanh). If you pass None, no activation is applied (i.e. "linear" activation: a(x) = x).
recurrent_activation Activation function to use for the recurrent step. Default: sigmoid (sigmoid). If you pass None, no activation is applied (i.e. "linear" activation: a(x) = x).
use_bias Boolean, (default True), whether the layer uses a bias vector.
kernel_initializer Initializer for the kernel weights matrix, used for the linear transformation of the inputs. Default: glorot_uniform.
recurrent_initializer Initializer for the recurrent_kernel weights matrix, used for the linear transformation of the recurrent state. Default: orthogonal.
bias_initializer Initializer for the bias vector. Default: zeros.
unit_forget_bias Boolean (default True). If True, add 1 to the bias of the forget gate at initialization. Setting it to True will also force bias_initializer="zeros". This is recommended in Jozefowicz et al.
kernel_regularizer Regularizer function applied to the kernel weights matrix. Default: None.
recurrent_regularizer Regularizer function applied to the recurrent_kernel weights matrix. Default: None.
bias_regularizer Regularizer function applied to the bias vector. Default: None.
kernel_constraint Constraint function applied to the kernel weights matrix. Default: None.
recurrent_constraint Constraint function applied to the recurrent_kernel weights matrix. Default: None.
bias_constraint Constraint function applied to the bias vector. Default: None.
dropout Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs. Default: 0.
recurrent_dropout Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state. Default: 0.
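
For illustration, here is a minimal sketch of how several of these constructor arguments combine; the specific values (64 units, 20% input dropout, L2 weight decay on the input kernel) are arbitrary choices, not recommendations:

import tensorflow as tf

# An LSTMCell with input dropout, recurrent dropout, and L2 regularization
# on the input kernel; every other argument keeps its default.
cell = tf.keras.layers.LSTMCell(
    units=64,
    activation="tanh",
    recurrent_activation="sigmoid",
    dropout=0.2,
    recurrent_dropout=0.1,
    kernel_regularizer=tf.keras.regularizers.l2(1e-4))

# Wrap the cell in an RNN layer to run it over a whole sequence.
layer = tf.keras.layers.RNN(cell)
output = layer(tf.random.normal([32, 10, 8]))
print(output.shape)
(32, 64)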

Call arguments:

  • inputs: A 2D tensor, with shape [batch, feature].
  • states: List of 2 tensors corresponding to the cell's units. Both have shape [batch, units]: the first tensor is the memory state from the previous time step, and the second is the carry state from the previous time step. For timestep 0, the initial state provided by the user will be fed to the cell (see the example after this list).
  • training: Python boolean indicating whether the layer should behave in training mode or in inference mode. Only relevant when dropout or recurrent_dropout is used.
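
As a minimal sketch of calling the cell directly for a single timestep (the zero initial states and shapes below are only illustrative):

import tensorflow as tf

cell = tf.keras.layers.LSTMCell(4)
batch, feature = 32, 8

# Initial states for timestep 0: [memory state, carry state], each [batch, units].
states = [tf.zeros([batch, 4]), tf.zeros([batch, 4])]

inputs = tf.random.normal([batch, feature])  # one timestep: [batch, feature]
output, states = cell(inputs, states, training=False)
print(output.shape)
(32, 4)
print(states[0].shape)
(32, 4)
print(states[1].shape)
(32, 4)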

Methods

get_dropout_mask_for_cell

Get the dropout mask for RNN cell's input.

It will create a mask based on context if there isn't an existing cached mask. If a new mask is generated, it will update the cache in the cell.

Args
inputs The input tensor whose shape will be used to generate the dropout mask.
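
For illustration, a minimal sketch, assuming the TensorFlow 2.x signature get_dropout_mask_for_cell(inputs, training, count=1); LSTMCell itself requests four masks internally, one per gate:

import tensorflow as tf

cell = tf.keras.layers.LSTMCell(4, dropout=0.5)
step_inputs = tf.ones([32, 8])  # one timestep of input: [batch, feature]

# Request four masks, one per LSTM gate; each has the shape of step_inputs.
masks = cell.get_dropout_mask_for_cell(step_inputs, training=True, count=4)
print(len(masks), masks[0].shape)
4 (32, 8)

# With dropout=0 there is nothing to mask, so the method returns None.
print(tf.keras.layers.LSTMCell(4).get_dropout_mask_for_cell(
    step_inputs, training=True))
None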