tf.compat.v1.nn.rnn_cell.GRUCell

Gated Recurrent Unit cell.

Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnGRU for better performance on GPU, or tf.contrib.rnn.GRUBlockCellV2 for better performance on CPU.
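A minimal construction sketch in TF1 graph mode (the batch size, input depth, and unit count below are illustrative assumptions, not part of this page):

```python
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

batch_size, input_depth, num_units = 4, 8, 16  # illustrative sizes

x_t = tf.placeholder(tf.float32, [batch_size, input_depth])  # input at one time step
cell = tf.nn.rnn_cell.GRUCell(num_units)
h_0 = cell.zero_state(batch_size, dtype=tf.float32)          # all-zeros initial state

# For a GRU the output and the new state are the same tensor,
# both of shape [batch_size, num_units].
output, h_1 = cell(x_t, h_0)
```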

Args

num_units: int, The number of units in the GRU cell.
activation: Nonlinearity to use. Default: tanh.
reuse: (optional) Python boolean describing whether to reuse variables in an existing scope. If not True, and the existing scope already has the given variables, an error is raised.
kernel_initializer: (optional) The initializer to use for the weight and projection matrices.
bias_initializer: (optional) The initializer to use for the bias.
name: String, the name of the layer. Layers with the same name will share weights, but to avoid mistakes we require reuse=True in such cases.
dtype: Default dtype of the layer (default of None means use the type of the first input). Required when build is called before call.
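A usage sketch that wires the cell into tf.compat.v1.nn.dynamic_rnn and exercises the constructor arguments above; the shapes, the initializer choice, and the layer name "gru" are assumptions made for illustration:

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

batch_size, max_time, input_depth, num_units = 4, 10, 8, 16  # illustrative sizes

inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_depth])

cell = tf.nn.rnn_cell.GRUCell(
    num_units,
    activation=tf.tanh,
    kernel_initializer=tf.glorot_uniform_initializer(),
    name="gru",          # hypothetical layer name
    dtype=tf.float32)

# Unroll the cell over the time dimension.
outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    out, state = sess.run(
        [outputs, final_state],
        feed_dict={inputs: np.random.rand(
            batch_size, max_time, input_depth).astype(np.float32)})
    print(out.shape)    # (4, 10, 16)
    print(state.shape)  # (4, 16)
```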