


Class LayerNormLSTMCell

LSTM cell with layer normalization and recurrent dropout.

Aliases: tfa.rnn.cell.LayerNormLSTMCell

This class adds layer normalization and recurrent dropout to an LSTM cell. The layer normalization implementation is based on:

"Layer Normalization" Jimmy Lei Ba, Jamie Ryan Kiros, Geoffrey E. Hinton

and is applied before the internal nonlinearities. Recurrent dropout is based on:

"Recurrent Dropout without Memory Loss" Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth.
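The layer-normalization step described above can be sketched in plain Python. This is a minimal illustration of normalizing pre-activation values before a nonlinearity, not the library's implementation; `gamma` and `beta` stand in for the cell's learned scale and shift parameters:

```python
import math

def layer_norm(x, gamma=1.0, beta=0.0, eps=1e-6):
    # Normalize a vector of pre-activation values across the feature
    # axis to zero mean and unit variance, then apply a learned scale
    # (gamma) and shift (beta). eps guards against division by zero.
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [gamma * (v - mean) / math.sqrt(var + eps) + beta for v in x]

# In the cell, each gate's pre-activation would be normalized like this
# before the sigmoid/tanh nonlinearity is applied.
normed = layer_norm([1.0, 2.0, 3.0, 4.0])
```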
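The key idea of recurrent dropout in this scheme is that a single drop mask is sampled per sequence and reused at every time step, so the same hidden units are dropped throughout, avoiding the memory loss that per-step masks cause. A minimal sketch under that assumption (the function and parameter names here are illustrative, not the library's API):

```python
import random

def recurrent_dropout_mask(size, rate, seed=0):
    # Sample one keep/drop mask for the whole sequence. Kept units are
    # scaled by 1/keep_prob (inverted dropout) so expected magnitudes
    # are preserved at training time.
    rng = random.Random(seed)
    keep_prob = 1.0 - rate
    return [1.0 / keep_prob if rng.random() < keep_prob else 0.0
            for _ in range(size)]

def apply_recurrent_dropout(candidates, mask):
    # Apply the fixed mask to the candidate cell update at one step.
    return [g * m for g, m in zip(candidates, mask)]

mask = recurrent_dropout_mask(size=8, rate=0.25, seed=42)
step1 = apply_recurrent_dropout([0.5] * 8, mask)
step2 = apply_recurrent_dropout([0.5] * 8, mask)  # same mask reused
```

Because the mask is fixed across steps, `step1` and `step2` zero out exactly the same positions, which is what distinguishes this scheme from naively re-sampling dropout at every time step.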