
tfa.rnn.LayerNormLSTMCell


Class LayerNormLSTMCell

LSTM cell with layer normalization and recurrent dropout.

Aliases: tfa.rnn.cell.LayerNormLSTMCell

This class adds layer normalization and recurrent dropout to an LSTM unit. The layer normalization implementation is based on:

https://arxiv.org/abs/1607.06450

"Layer Normalization" Jimmy Lei Ba, Jamie Ryan Kiros, Geoffrey E. Hinton

and is applied before the internal nonlinearities. Recurrent dropout is based on:

https://arxiv.org/abs/1603.05118

"Recurrent Dropout without Memory Loss" Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth.