Computes sparse softmax cross entropy between logits and labels.
tf.compat.v1.nn.sparse_softmax_cross_entropy_with_logits( _sentinel=None, labels=None, logits=None, name=None )
Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.
A common use case is to have logits of shape [batch_size, num_classes] and labels of shape [batch_size], but higher dimensions are supported, in which case the dim-th dimension is assumed to be of size num_classes. logits must have the dtype of float16, float32, or float64, and labels must have the dtype of int32 or int64.
Note that to avoid confusion, it is required to pass only named arguments to this function.
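As a minimal sketch of the common [batch_size, num_classes] case (the tensor values and sizes below are illustrative, not from this page):

```python
import tensorflow as tf

# Three examples, four mutually exclusive classes.
# labels holds one class index per example (int32/int64), not one-hot vectors.
labels = tf.constant([0, 2, 3], dtype=tf.int64)  # shape [batch_size]
logits = tf.constant([[4.0, 1.0, 0.2, 0.1],
                      [0.5, 0.3, 3.1, 0.2],
                      [0.1, 0.2, 0.3, 5.0]],
                     dtype=tf.float32)           # shape [batch_size, num_classes]

# Arguments must be passed by name; the result is the per-example loss,
# a tensor of shape [batch_size].
loss = tf.compat.v1.nn.sparse_softmax_cross_entropy_with_logits(
    labels=labels, logits=logits)
```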
Args
|`_sentinel`|Used to prevent positional parameters. Internal, do not use.|
|`labels`|Tensor of shape [d_0, d_1, ..., d_{r-1}] (where r is the rank of labels and of the result) and dtype int32 or int64. Each entry in labels must be an index in [0, num_classes).|
|`logits`|Per-label activations (typically a linear output) of shape [d_0, d_1, ..., d_{r-1}, num_classes] and dtype float16, float32, or float64.|
|`name`|A name for the operation (optional).|
Raises
|`ValueError`|If logits are scalars (need to have rank >= 1) or if the rank of the labels is not equal to the rank of the logits minus one.|
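A sketch of the higher-rank case mentioned above (the shapes here are assumed for illustration): labels must have exactly one fewer dimension than logits, otherwise the ValueError above is raised.

```python
import tensorflow as tf

batch_size, seq_len, num_classes = 2, 5, 10  # illustrative sizes

# rank(labels) == rank(logits) - 1: labels is rank 2, logits is rank 3.
labels = tf.random.uniform(
    [batch_size, seq_len], maxval=num_classes, dtype=tf.int32)
logits = tf.random.normal([batch_size, seq_len, num_classes])

# Per-entry loss with the same shape as labels: [batch_size, seq_len].
loss = tf.compat.v1.nn.sparse_softmax_cross_entropy_with_logits(
    labels=labels, logits=logits)

# Passing scalar logits (rank 0) or labels whose rank is not
# rank(logits) - 1 raises the ValueError described above.
```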