tf.compat.v1.nn.rnn_cell.DropoutWrapper

Operator adding dropout to inputs and outputs of the given cell.

Inherits From: RNNCell, Layer, Layer, Module

Args

cell an RNNCell to wrap; dropout is applied to its inputs, outputs, and/or state according to the keep probabilities below.
input_keep_prob unit Tensor or float between 0 and 1, input keep probability; if it is constant and 1, no input dropout will be added.
output_keep_prob unit Tensor or float between 0 and 1, output keep probability; if it is constant and 1, no output dropout will be added.
state_keep_prob unit Tensor or float between 0 and 1, state keep probability; if it is constant and 1, no state dropout will be added. State dropout is performed on the outgoing states of the cell. Note the state components to which dropout is applied when state_keep_prob is in (0, 1) are also determined by the argument dropout_state_filter_visitor (e.g. by default dropout is never applied to the c component of an LSTMStateTuple).
variational_recurrent Python bool. If True, then the same dropout pattern is applied across all time steps per run call. If this parameter is set, input_size must be provided (see the usage sketch after the argument list).
input_size (optional) (possibly nested tuple of) TensorShape objects containing the depth(s) of the input tensors expected to be passed in to the DropoutWrapper. Required and used iff variational_recurrent = True and input_keep_prob < 1.
dtype (optional) The dtype of the input, state, and output tensors. Required and used iff variational_recurrent = True.
seed (optional) integer, the randomness seed.
dropout_state_filter_visitor (optional), default: see below. Function that takes any hierarchical level of the state and returns a scalar or depth=1 structure of Python booleans describing which terms in the state should be dropped out. If the function returns True, dropout is applied across this sublevel; if the function returns False, dropout is not applied across this entire sublevel. Default behavior: perform dropout on all terms except the memory (c) state of LSTMCellState objects, and don't try to apply dropout to TensorArray objects:

```python
def dropout_state_filter_visitor(s):
  if isinstance(s, LSTMCellState):
    # Never perform dropout on the c state.
    return LSTMCellState(c=False, h=True)
  elif isinstance(s, TensorArray):
    return False
  return True
```
**kwargs dict of keyword arguments for base layer.
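Below is a minimal usage sketch, assuming a graph-mode (tf.compat.v1) program; the sizes, the placeholder, and names such as inputs, cell, and variational_cell are hypothetical and chosen only for illustration:

```python
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

# Hypothetical dimensions, for illustration only.
batch_size, max_time, input_depth, num_units = 32, 10, 8, 64

inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_depth])

# Plain dropout on the inputs, outputs, and state of an LSTMCell.
cell = tf.nn.rnn_cell.DropoutWrapper(
    tf.nn.rnn_cell.LSTMCell(num_units),
    input_keep_prob=0.9,
    output_keep_prob=0.8,
    state_keep_prob=0.9)

# Variational recurrent dropout: the same dropout mask is reused at every
# time step. input_size and dtype are required here because
# variational_recurrent=True and input_keep_prob < 1.
variational_cell = tf.nn.rnn_cell.DropoutWrapper(
    tf.nn.rnn_cell.LSTMCell(num_units),
    input_keep_prob=0.9,
    output_keep_prob=0.8,
    state_keep_prob=0.9,
    variational_recurrent=True,
    input_size=tf.TensorShape([input_depth]),
    dtype=tf.float32)

outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
```

Dropout is only meaningful during training; for evaluation, construct the wrapper with keep probabilities of 1, or feed the probabilities through placeholders so they can be switched off.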

Raises

TypeError if cell is not an RNNCell, or keep_state_fn is provided but not callable.
ValueError if any of the keep_probs are not between 0 and 1.

Attributes

graph

output_size Integer or TensorShape: size of outputs produced by this cell.
scope_name

state_size size(s) of state(s) used by this cell.

It can be represented by an Integer, a TensorShape or a tuple of Integers or TensorShapes.

wrapped_cell

Methods

apply


get_initial_state


get_losses_for


Retrieves losses relevant to a specific set of inputs.

Args
inputs Input tensor or list/tuple of input tensors.

Returns
List of loss tensors of the layer that depend on inputs.

get_updates_for


Retrieves updates relevant to a specific set of inputs.

Args
inputs Input tensor or list/tuple of input tensors.

Returns
List of update ops of the layer that depend on inputs.
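Both methods are inherited from the v1 Layer base class. A brief sketch, reusing the hypothetical cell and inputs from the example above; for a plain DropoutWrapper these calls typically return empty lists unless the wrapped cell registers input-conditional losses or update ops:

```python
# Query Layer bookkeeping after the wrapper has been called on `inputs`
# (e.g. through tf.nn.dynamic_rnn above).
input_conditional_losses = cell.get_losses_for(inputs)
input_conditional_updates = cell.get_updates_for(inputs)
```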

zero_state


Return zero-filled state tensor(s).

Args
batch_size int, float, or unit Tensor representing the batch size.
dtype the data type to use for the state.

Returns
If state_size is an int or TensorShape, then the return value is an N-D tensor of shape [batch_size, state_size] filled with zeros.

If state_size is a nested list or tuple, then the return value is a nested list or tuple (of the same structure) of 2-D tensors with the shapes [batch_size, s] for each s in state_size.
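A short sketch of requesting a zero-filled initial state from the wrapper, again reusing the hypothetical cell and batch_size defined earlier; because the wrapped LSTMCell's state_size is an LSTMStateTuple, the result mirrors that structure:

```python
# Zero-filled initial state matching the wrapped cell's state structure:
# an LSTMStateTuple of two [batch_size, num_units] zero tensors.
initial_state = cell.zero_state(batch_size, dtype=tf.float32)
print(initial_state)
```

Such a state can be passed to tf.nn.dynamic_rnn through its initial_state argument in place of the dtype-based default.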