tf.contrib.rnn.TimeFreqLSTMCell

Time-Frequency Long short-term memory unit (LSTM) recurrent network cell.

Inherits From: RNNCell

This implementation is based on:

Tara N. Sainath and Bo Li, "Modeling Time-Frequency Patterns with LSTM vs. Convolutional Architectures for LVCSR Tasks," submitted to INTERSPEECH, 2016.

It uses peephole connections and optional cell clipping.

Args

num_units int, The number of units in the LSTM cell.
use_peepholes bool, set True to enable diagonal/peephole connections.
cell_clip (optional) A float value; if provided, the cell state is clipped by this value prior to the cell output activation.
initializer (optional) The initializer to use for the weight and projection matrices.
num_unit_shards int, How to split the weight matrix. If >1, the weight matrix is stored across num_unit_shards.
forget_bias float, Biases of the forget gate are initialized by default to 1 in order to reduce the scale of forgetting at the beginning of the training.
feature_size int, The size of the input feature the LSTM spans over.
frequency_skip int, The amount the LSTM filter is shifted by in frequency.
reuse (optional) Python boolean describing whether to reuse variables in an existing scope. If not True, and the existing scope already has the given variables, an error is raised.
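
As a usage illustration, here is a minimal sketch (not part of the original documentation) that builds the cell and runs it for a single step in a TF 1.x graph. The batch size, input width, and hyperparameter values are illustrative, chosen so the input spans exactly one frequency block (input width equals feature_size):

```python
import tensorflow as tf  # assumes a TF 1.x release where tf.contrib is available

batch_size = 8
num_units = 64
feature_size = 40     # width of the frequency patch the cell spans
frequency_skip = 10   # shift of the patch between frequency blocks

cell = tf.contrib.rnn.TimeFreqLSTMCell(
    num_units=num_units,
    use_peepholes=True,
    cell_clip=10.0,
    forget_bias=1.0,
    feature_size=feature_size,
    frequency_skip=frequency_skip)

# Illustrative input: one frame of 40 frequency bins per example, so the
# cell covers a single frequency block here.
inputs = tf.random_normal([batch_size, feature_size])
state = cell.zero_state(batch_size, dtype=tf.float32)
output, new_state = cell(inputs, state)

with tf.Session() as sess:
  sess.run(tf.global_variables_initializer())
  print(sess.run(output).shape)
```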

Attributes

graph DEPRECATED FUNCTION

output_size Integer or TensorShape: size of outputs produced by this cell.
scope_name

state_size size(s) of state(s) used by this cell.

It can be represented by an Integer, a TensorShape or a tuple of Integers or TensorShapes.
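
For a quick look at how this cell reports these sizes, the snippet below (illustrative parameter values, same TF 1.x assumption as above) simply reads the properties back:

```python
import tensorflow as tf  # assumes a TF 1.x release with tf.contrib

cell = tf.contrib.rnn.TimeFreqLSTMCell(
    num_units=32, feature_size=8, frequency_skip=1)

print(cell.output_size)  # size of each per-step output
print(cell.state_size)   # size(s) of the state passed between steps
```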

Methods

get_initial_state

zero_state

Return zero-filled state tensor(s).

Args
batch_size int, float, or unit Tensor representing the batch size.
dtype the data type to use for the state.

Returns
If state_size is an int or TensorShape, then the return value is an N-D tensor of shape [batch_size, state_size] filled with zeros.

If state_size is a nested list or tuple, then the return value is a nested list or tuple (of the same structure) of 2-D tensors with the shapes [batch_size, s] for each s in state_size.
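
A short sketch of calling zero_state, under the same TF 1.x assumption and with illustrative sizes:

```python
import tensorflow as tf  # assumes a TF 1.x release with tf.contrib

cell = tf.contrib.rnn.TimeFreqLSTMCell(
    num_units=16, feature_size=4, frequency_skip=1)

# For an int state_size this is a single [batch_size, state_size]
# tensor of zeros (see Returns above).
init_state = cell.zero_state(batch_size=32, dtype=tf.float32)

with tf.Session() as sess:
  print(sess.run(init_state).shape)
```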