tf.compat.v1.keras.layers.LSTM

Long Short-Term Memory layer - Hochreiter & Schmidhuber, 1997.

Inherits From: RNN

Note that this layer is not optimized for performance on GPU. Please use tf.compat.v1.keras.layers.CuDNNLSTM for better performance on GPU.
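A minimal usage sketch (assuming TensorFlow 2.x with eager execution, where this compat.v1 symbol is still exposed; the input shapes are illustrative):

```python
import numpy as np
import tensorflow as tf

# Batch of 32 sequences, each with 10 timesteps of 8 features.
x = np.random.random((32, 10, 8)).astype(np.float32)

# 4 output units; by default only the output at the final timestep is
# returned, so the result has shape (batch, units) = (32, 4).
layer = tf.compat.v1.keras.layers.LSTM(4)
y = layer(x)
print(y.shape)  # (32, 4)
```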

Arguments

units: Positive integer, dimensionality of the output space.
activation: Activation function to use. Default: hyperbolic tangent (tanh). If you pass None, no activation is applied (i.e. "linear" activation: a(x) = x).
recurrent_activation: Activation function to use for the recurrent step. Default: hard sigmoid (hard_sigmoid). If you pass None, no activation is applied (i.e. "linear" activation: a(x) = x).
use_bias: Boolean, whether the layer uses a bias vector.
kernel_initializer: Initializer for the kernel weights matrix, used for the linear transformation of the inputs.
recurrent_initializer: Initializer for the recurrent_kernel weights matrix, used for the linear transformation of the recurrent state.
bias_initializer: Initializer for the bias vector.
unit_forget_bias: Boolean. If True, add 1 to the bias of the forget gate at initialization. Setting it to True also forces bias_initializer="zeros". This is recommended in Jozefowicz et al., 2015.
kernel_regularizer: Regularizer function applied to the kernel weights matrix.
recurrent_regularizer: Regularizer function applied to the recurrent_kernel weights matrix.
bias_regularizer: Regularizer function applied to the bias vector.
activity_regularizer: Regularizer function applied to the output of the layer (its "activation").
kernel_constraint: Constraint function applied to the kernel weights matrix.
recurrent_constraint: Constraint function applied to the recurrent_kernel weights matrix.
bias_constraint: Constraint function applied to the bias vector.
dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs.
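For reference, a sketch constructing the layer with several of the arguments above spelled out. The numeric values (units=64, the L2 factor, dropout=0.2) are arbitrary illustrations, not recommendations:

```python
import tensorflow as tf

layer = tf.compat.v1.keras.layers.LSTM(
    units=64,
    activation="tanh",                    # default, per the description above
    recurrent_activation="hard_sigmoid",  # default, per the description above
    use_bias=True,
    kernel_initializer="glorot_uniform",
    recurrent_initializer="orthogonal",
    unit_forget_bias=True,                # forces bias_initializer="zeros"
    kernel_regularizer=tf.keras.regularizers.l2(1e-4),
    recurrent_regularizer=tf.keras.regularizers.l2(1e-4),
    dropout=0.2,                          # input dropout, applied in training
)
```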