Computes softmax cross entropy between logits and labels. (deprecated arguments)
tf.compat.v1.nn.softmax_cross_entropy_with_logits_v2(
    labels, logits, axis=None, name=None, dim=None
)
Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.
If using exclusive labels (wherein one and only one class is true at a time), see sparse_softmax_cross_entropy_with_logits.
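With exclusive labels, a common alternative to the sparse variant is to convert integer class indices into one-hot vectors before calling this function. A minimal NumPy sketch (the class indices and class count here are illustrative, not from the docs):

```python
import numpy as np

# Hypothetical integer class labels for a 3-class problem.
class_ids = np.array([0, 2, 1])
num_classes = 3

# One-hot encoding: row i has a 1.0 in column class_ids[i].
one_hot = np.eye(num_classes)[class_ids]
```

Each row of `one_hot` is a valid probability distribution with exactly one class set, which is the label format this function expects.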
A common use case is to have logits and labels of shape [batch_size, num_classes], but higher dimensions are supported, with the axis argument specifying the class dimension.
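To make the shapes concrete, here is a NumPy sketch of the quantity this op computes (softmax over the class axis, then cross entropy against the labels); this is an illustration of the math, not the TensorFlow implementation:

```python
import numpy as np

def softmax_cross_entropy(labels, logits, axis=-1):
    # Numerically stable log-softmax: shift logits by their max.
    shifted = logits - logits.max(axis=axis, keepdims=True)
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=axis, keepdims=True))
    # Cross entropy: -sum(labels * log_softmax) over the class axis.
    return -(labels * log_softmax).sum(axis=axis)

logits = np.array([[2.0, 1.0, 0.1]])         # shape [batch_size=1, num_classes=3]
labels = np.array([[1.0, 0.0, 0.0]])         # one-hot: true class is 0
loss = softmax_cross_entropy(labels, logits)  # shape [batch_size]
```

The result has one loss value per batch element, with the class axis reduced away, matching the shape behavior described above.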
logits and labels must have the same dtype (either float16, float32, or float64).
Backpropagation will happen into both logits and labels. To disallow backpropagation into labels, pass label tensors through tf.stop_gradient before feeding it to this function.
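The gradient flowing into the logits has a well-known closed form: softmax(logits) minus labels. A NumPy sketch of that formula (an illustration of the math, not TensorFlow's kernel):

```python
import numpy as np

def grad_wrt_logits(labels, logits, axis=-1):
    # d(loss)/d(logits) = softmax(logits) - labels.
    shifted = logits - logits.max(axis=axis, keepdims=True)
    exp = np.exp(shifted)
    softmax = exp / exp.sum(axis=axis, keepdims=True)
    return softmax - labels

labels = np.array([[0.0, 1.0, 0.0]])  # one-hot: true class is 1
logits = np.array([[0.5, 1.5, 0.2]])
g = grad_wrt_logits(labels, logits)
```

Because both the softmax and valid labels sum to 1 over the class axis, the gradient sums to zero along that axis, and the entry for the true class is negative (the logit is pushed up).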