
tf.keras.layers.LSTMCell

Cell class for the LSTM layer.

Inherits From: LSTMCell, Layer, Module

See the Keras RNN API guide for details about the usage of the RNN API.

This class processes one step within the whole time sequence input, whereas tf.keras.layers.LSTM processes the whole sequence.

For example:

import tensorflow as tf

# Batch of 32 sequences, 10 time steps, 8 features each.
inputs = tf.random.normal([32, 10, 8])
# Wrap the cell in an RNN layer to run it over the whole sequence.
rnn = tf.keras.layers.RNN(tf.keras.layers.LSTMCell(4))
output = rnn(inputs)
print(output.shape)
(32, 4)
# Return the per-step outputs and the final states as well.
rnn = tf.keras.layers.RNN(
    tf.keras.layers.LSTMCell(4),
    return_sequences=True,
    return_state=True)
whole_seq_output, final_memory_state, final_carry_state = rnn(inputs)
print(whole_seq_output.shape)
(32, 10, 4)
print(final_memory_state.shape)
(32, 4)
print(final_carry_state.shape)
(32, 4)
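
Because the cell operates on a single time step, it can also be called directly with an explicit state, carrying the [hidden state, carry state] pair between calls. A minimal sketch; the batch size, feature size, and variable names here are illustrative, not part of the API:

import tensorflow as tf

cell = tf.keras.layers.LSTMCell(4)
# One time step of input for a batch of 32 samples with 8 features.
step_input = tf.random.normal([32, 8])
# Zero-filled initial [hidden state, carry state], each of shape (32, 4).
states = cell.get_initial_state(batch_size=32, dtype=tf.float32)
output, states = cell(step_input, states)
print(output.shape)
(32, 4)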

Args

units: Positive integer, dimensionality of the output space.
activation: Activation function to use. Default: hyperbolic tangent (tanh). If you pass None, no activation is applied (i.e. "linear" activation: a(x) = x).
recurrent_activation: Activation function to use for the recurrent step. Default: sigmoid. If you pass None, no activation is applied (i.e. "linear" activation: a(x) = x).
use_bias: Boolean (default True), whether the layer uses a bias vector.
kernel_initializer: Initializer for the kernel weights matrix, used for the linear transformation of the inputs. Default: glorot_uniform.
recurrent_initializer: Initializer for the