Computes softmax cross entropy between logits and labels.

tf.contrib.nn.deprecated_flipped_softmax_cross_entropy_with_logits(logits, labels, dim=-1, name=None)

This function differs from tf.nn.softmax_cross_entropy_with_logits only in the argument order.
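For reference, a minimal sketch of the flipped argument order against the current tf.nn API; it assumes a TensorFlow 1.x environment where tf.contrib is still available, and the values are hypothetical:

```python
import tensorflow as tf  # assumes TensorFlow 1.x (tf.contrib is removed in 2.x)

logits = tf.constant([[2.0, 1.0, 0.1]])
labels = tf.constant([[1.0, 0.0, 0.0]])

# Deprecated flipped version: logits come first positionally.
loss_flipped = tf.contrib.nn.deprecated_flipped_softmax_cross_entropy_with_logits(
    logits, labels)

# tf.nn version: pass labels and logits by keyword.
loss_current = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

with tf.Session() as sess:
    # Both compute the same per-example loss.
    print(sess.run([loss_flipped, loss_current]))
```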
Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.
NOTE: While the classes are mutually exclusive, their probabilities need not be. All that is required is that each row of labels is a valid probability distribution. If they are not, the computation of the gradient will be incorrect.
If using exclusive labels (wherein one and only one class is true at a time), see sparse_softmax_cross_entropy_with_logits.
WARNING: This op expects unscaled logits, since it performs a softmax on logits internally for efficiency. Do not call this op with the output of softmax, as it will produce incorrect results.
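To illustrate the warning, a short sketch (same TensorFlow 1.x assumption and hypothetical values as above):

```python
import tensorflow as tf  # assumes TensorFlow 1.x

logits = tf.constant([[2.0, 1.0, 0.1]])
labels = tf.constant([[1.0, 0.0, 0.0]])

# Correct: pass the raw, unscaled logits.
loss = tf.contrib.nn.deprecated_flipped_softmax_cross_entropy_with_logits(
    logits, labels)

# Incorrect: the op applies softmax internally, so passing softmax output
# effectively applies softmax twice and yields the wrong loss.
probs = tf.nn.softmax(logits)
wrong_loss = tf.contrib.nn.deprecated_flipped_softmax_cross_entropy_with_logits(
    probs, labels)
```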
logits and labels must have the same shape [batch_size, num_classes] and the same dtype (either float16, float32, or float64).
Args:
  logits: Unscaled log probabilities.
  labels: Each row labels[i] must be a valid probability distribution.
  dim: The class dimension. Defaulted to -1 which is the last dimension.
  name: A name for the operation (optional).
Returns:
  A Tensor of length batch_size of the same type as logits with the softmax cross entropy loss.
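An end-to-end sketch of the documented shapes (TensorFlow 1.x assumed; values are hypothetical):

```python
import tensorflow as tf  # assumes TensorFlow 1.x

# logits and labels share shape [batch_size, num_classes] = [2, 3].
logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 2.5, 0.3]])
labels = tf.constant([[0.7, 0.2, 0.1],   # soft labels: each row sums to 1
                      [0.0, 1.0, 0.0]])

loss = tf.contrib.nn.deprecated_flipped_softmax_cross_entropy_with_logits(
    logits, labels)

with tf.Session() as sess:
    # A Tensor of length batch_size (here, 2) with the per-example loss.
    print(sess.run(loss))
```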