Weighted cross-entropy loss for a sequence of logits, collapsed across the batch dimension.
```python
tf.contrib.legacy_seq2seq.sequence_loss(
    logits,
    targets,
    weights,
    average_across_timesteps=True,
    average_across_batch=True,
    softmax_loss_function=None,
    name=None
)
```
Args:
  logits: List of 2D Tensors of shape [batch_size x num_decoder_symbols].
  targets: List of 1D batch-sized int32 Tensors of the same length as logits.
  weights: List of 1D batch-sized float Tensors of the same length as logits.
  average_across_timesteps: If set, divide the returned cost by the total label weight.
  average_across_batch: If set, divide the returned cost by the batch size.
  softmax_loss_function: Function (labels, logits) -> loss-batch to be used instead of the standard softmax (the default if this is None). To avoid argument-order confusion, the function is required to accept labels and logits as named arguments; see the sketch after the Raises section.
  name: Optional name for this operation; defaults to "sequence_loss".
Returns:
  A scalar float Tensor: the average log-perplexity per symbol (weighted).
Raises:
  ValueError: If len(logits) is different from len(targets) or len(weights).
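A minimal usage sketch (TF 1.x graph mode; tf.contrib was removed in TF 2.x). The shapes, step count, and target values here are illustrative only: three decoding steps, with all-ones weights standing in for an unpadded batch.

```python
import tensorflow as tf  # TF 1.x only; tf.contrib does not exist in TF 2.x

batch_size, num_decoder_symbols, num_steps = 4, 10, 3  # toy sizes, assumption

# One 2D logits Tensor per time step: [batch_size x num_decoder_symbols].
logits = [tf.random.normal([batch_size, num_decoder_symbols])
          for _ in range(num_steps)]

# One 1D int32 target Tensor per time step: [batch_size].
targets = [tf.zeros([batch_size], dtype=tf.int32) for _ in range(num_steps)]

# One 1D float weight Tensor per time step; zeros would mask padded positions.
weights = [tf.ones([batch_size]) for _ in range(num_steps)]

loss = tf.contrib.legacy_seq2seq.sequence_loss(
    logits, targets, weights,
    average_across_timesteps=True,
    average_across_batch=True)

with tf.Session() as sess:
    # A scalar: the weighted average log-perplexity per symbol.
    print(sess.run(loss))
```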
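And a sketch of a custom softmax_loss_function, continuing the toy setup above. The name my_loss is hypothetical; per the note in Args, the function must accept labels and logits as named arguments and return one loss value per batch element. This version simply wraps the sparse softmax cross-entropy op, which (to the best of our reading of the legacy_seq2seq source) is also what the default path uses when this argument is None:

```python
import tensorflow as tf  # TF 1.x

def my_loss(labels, logits):  # hypothetical name; named args are required
    # Returns a [batch_size] Tensor of per-example losses ("loss-batch").
    return tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=logits)

# logits, targets, weights as defined in the previous sketch.
loss = tf.contrib.legacy_seq2seq.sequence_loss(
    logits, targets, weights,
    softmax_loss_function=my_loss)
```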