LSTM cell with layer normalization and recurrent dropout.
This class adds layer normalization and recurrent dropout to an LSTM unit. The layer normalization implementation is based on:
"Layer Normalization" Jimmy Lei Ba, Jamie Ryan Kiros, Geoffrey E. Hinton
and is applied before the internal nonlinearities. Recurrent dropout is based on:
"Recurrent Dropout without Memory Loss" Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth.
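The combination described above can be illustrated with a minimal NumPy sketch of a single cell step. This is a hypothetical, simplified implementation, not the cell's actual code: it shares one identity-initialized gain/bias pair across gates (the real cell learns separate parameters per normalized block), and it applies dropout only to the candidate update `j`, which is how Semeniuta et al. avoid memory loss in the cell state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def layer_norm(x, gain, bias, eps=1e-6):
    # Normalize over the feature dimension, then scale and shift.
    mu = x.mean(axis=-1, keepdims=True)
    sigma = x.std(axis=-1, keepdims=True)
    return gain * (x - mu) / (sigma + eps) + bias

def lstm_step(x, h, c, Wx, Wh, keep_prob=1.0, rng=None):
    """One LSTM step with layer norm before each nonlinearity
    and recurrent dropout on the candidate update (sketch)."""
    n = h.shape[-1]
    gain, bias = np.ones(n), np.zeros(n)      # shared identity LN params (sketch only)
    z = x @ Wx + h @ Wh                        # (batch, 4n) gate pre-activations
    i, f, j, o = np.split(z, 4, axis=-1)
    i = sigmoid(layer_norm(i, gain, bias))     # input gate
    f = sigmoid(layer_norm(f, gain, bias))     # forget gate
    j = np.tanh(layer_norm(j, gain, bias))     # candidate update
    o = sigmoid(layer_norm(o, gain, bias))     # output gate
    if rng is not None and keep_prob < 1.0:
        # Recurrent dropout on the candidate only, so the cell
        # state c itself is never zeroed out (no memory loss).
        mask = rng.binomial(1, keep_prob, j.shape) / keep_prob
        j = j * mask
    c_new = f * c + i * j
    h_new = o * np.tanh(layer_norm(c_new, gain, bias))
    return h_new, c_new
```

Because the output gate is a sigmoid and the normalized cell state passes through `tanh`, the returned hidden state is bounded in magnitude by 1.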