tfa.rnn.LayerNormLSTMCell

Class LayerNormLSTMCell

LSTM cell with layer normalization and recurrent dropout.

This class adds layer normalization and recurrent dropout to an LSTM unit. The layer normalization implementation is based on:

https://arxiv.org/abs/1607.06450.

"Layer Normalization" Jimmy Lei Ba, Jamie Ryan Kiros, Geoffrey E. Hinton

and is applied before the internal nonlinearities. Recurrent dropout is based on:
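As described in the paper, layer normalization standardizes each pre-activation vector across its feature dimension, then rescales and shifts it with learned gain and bias parameters. A minimal NumPy sketch of the operation (the helper name and signature are illustrative, not the library's internals):

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Normalize each row of x across the last (feature) axis,
    # then apply the learned gain (gamma) and bias (beta).
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

x = np.array([[1.0, 2.0, 3.0, 4.0]])
y = layer_norm(x, gamma=1.0, beta=0.0)
```

With unit gain and zero bias, the output of each example has approximately zero mean and unit variance, independent of the other examples in the batch.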

https://arxiv.org/abs/1603.05118

"Recurrent Dropout without Memory Loss" Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth.
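The key idea of that paper is to apply dropout only to the candidate cell update, not to the cell state itself, so long-term memory is not erased. A NumPy sketch of the cell-state update under this scheme (the function and variable names are illustrative, not the library's internals):

```python
import numpy as np

def cell_state_update(c_prev, f, i, g, rate, rng, training=True):
    # c_prev: previous cell state; f, i: forget and input gate activations;
    # g: candidate update. Dropout masks only g, so the recurrent
    # memory path f * c_prev is never zeroed out.
    if training and rate > 0.0:
        keep = 1.0 - rate
        mask = rng.binomial(1, keep, size=g.shape) / keep  # inverted dropout
        g = g * mask
    return f * c_prev + i * g

rng = np.random.default_rng(0)
c_prev = np.ones(4)
f = np.full(4, 0.5)
i = np.full(4, 0.5)
g = np.ones(4)
c_next = cell_state_update(c_prev, f, i, g, rate=0.25, rng=rng)
```

At inference time (or with a dropout rate of zero) the update reduces to the standard LSTM state equation c_t = f * c_{t-1} + i * g.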