This function differs from tf.nn.softmax_cross_entropy_with_logits only in the
argument order.
Measures the probability error in discrete classification tasks in which the
classes are mutually exclusive (each entry is in exactly one class). For
example, each CIFAR-10 image is labeled with one and only one label: an image
can be a dog or a truck, but not both.
If using exclusive labels (wherein one and only
one class is true at a time), see sparse_softmax_cross_entropy_with_logits.
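For the exclusive-label case, a minimal sketch of the sparse variant, with
integer class indices instead of probability rows (the tensor values here are
invented for illustration):

import tensorflow as tf

# Two examples, three classes; values are made up for illustration.
logits = tf.constant([[4.0, 2.0, 1.0],
                      [0.0, 5.0, 1.0]])
# Sparse labels: one integer class index per example, not a distribution.
class_ids = tf.constant([0, 1])

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=class_ids,
                                                      logits=logits)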
logits and labels must have the same shape [batch_size, num_classes]
and the same dtype (either float16, float32, or float64).
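A minimal usage sketch under those shape and dtype requirements. All tensor
values below are invented for illustration, and the call uses
tf.nn.softmax_cross_entropy_with_logits with keyword arguments, which
sidesteps the argument-order difference noted above:

import tensorflow as tf

# Batch of 2 examples over 3 classes: shapes are [batch_size, num_classes].
logits = tf.constant([[4.0, 2.0, 1.0],
                      [0.0, 5.0, 1.0]])
# Each row is a valid probability distribution over the 3 classes.
labels = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 0.8, 0.2]])

# One loss value per example: a 1-D tensor of shape [2].
loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)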
Args
  logits: Unscaled log probabilities.
  labels: Each row labels[i] must be a valid probability distribution.
  dim: The class dimension. Defaults to -1, which is the last dimension.
  name: A name for the operation (optional).
Returns
  A 1-D Tensor of length batch_size and the same dtype as logits, containing
  the softmax cross entropy loss.
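In practice the per-example losses are usually averaged into a scalar training
loss; a short sketch (values invented for illustration):

import tensorflow as tf

logits = tf.constant([[4.0, 2.0, 1.0],
                      [0.0, 5.0, 1.0]])
labels = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])

per_example_loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels,
                                                           logits=logits)
print(per_example_loss.shape)                  # (2,) -- one value per example
mean_loss = tf.reduce_mean(per_example_loss)   # scalar loss for training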
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2020-10-01 UTC."],[],[]]