Creates a cross-entropy loss using tf.nn.softmax_cross_entropy_with_logits_v2.
    tf.losses.softmax_cross_entropy(
        onehot_labels,
        logits,
        weights=1.0,
        label_smoothing=0,
        scope=None,
        loss_collection=tf.GraphKeys.LOSSES,
        reduction=Reduction.SUM_BY_NONZERO_WEIGHTS
    )
weights acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weights is a tensor of shape [batch_size], then the loss weights apply to each corresponding sample.
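For instance, a minimal sketch of both weighting modes (assuming a TF 1.x graph-mode environment; the tensor values are purely illustrative):

    import tensorflow as tf

    onehot_labels = tf.constant([[1.0, 0.0], [0.0, 1.0]])  # [batch_size=2, num_classes=2]
    logits = tf.constant([[2.0, 0.5], [0.3, 1.7]])

    # Scalar weight: the reduced loss is simply scaled by 0.5.
    scaled = tf.losses.softmax_cross_entropy(onehot_labels, logits, weights=0.5)

    # weights of shape [batch_size]: the second sample counts twice
    # as much as the first before reduction.
    per_sample = tf.losses.softmax_cross_entropy(
        onehot_labels, logits, weights=tf.constant([1.0, 2.0]))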
If label_smoothing is nonzero, smooth the labels towards 1/num_classes:

    new_onehot_labels = onehot_labels * (1 - label_smoothing)
                        + label_smoothing / num_classes
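A sketch of the effect (illustrative values; with num_classes=2 and label_smoothing=0.1, each hard 1 becomes 0.95 and each hard 0 becomes 0.05):

    import tensorflow as tf

    onehot_labels = tf.constant([[1.0, 0.0], [0.0, 1.0]])
    logits = tf.constant([[2.0, 0.5], [0.3, 1.7]])

    # Internally the labels are replaced by
    # onehot_labels * (1 - 0.1) + 0.1 / 2, i.e. [0.95, 0.05] per row.
    smoothed_loss = tf.losses.softmax_cross_entropy(
        onehot_labels, logits, label_smoothing=0.1)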
Note that onehot_labels and logits must have the same shape, e.g. [batch_size, num_classes]. The shape of weights must be broadcastable to loss, whose shape is decided by the shape of logits. In case the shape of logits is [batch_size, num_classes], loss is a Tensor of shape [batch_size].
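A sketch of this shape relationship (illustrative values; assuming TF 1.x):

    import tensorflow as tf

    onehot_labels = tf.constant([[1.0, 0.0], [0.0, 1.0]])  # [2, 2]
    logits = tf.constant([[2.0, 0.5], [0.3, 1.7]])         # [2, 2]

    # With reduction=NONE the unreduced loss has shape [batch_size] = [2],
    # so weights of shape [2] (or a scalar) broadcast against it.
    loss = tf.losses.softmax_cross_entropy(
        onehot_labels, logits,
        weights=tf.constant([1.0, 2.0]),
        reduction=tf.losses.Reduction.NONE)
    print(loss.shape)  # (2,)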
Args:
  onehot_labels: One-hot-encoded labels.
  logits: Logits outputs of the network.
  weights: Optional Tensor that is broadcastable to loss.
  label_smoothing: If greater than 0 then smooth the labels.
  scope: the scope for the operations performed in computing the loss.
  loss_collection: collection to which the loss will be added (see the sketch after this list).
  reduction: Type of reduction to apply to loss.
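For loss_collection, a small graph-mode sketch (assuming TF 1.x, where tf.losses.get_losses reads from the same collection):

    import tensorflow as tf

    onehot_labels = tf.constant([[1.0, 0.0]])
    logits = tf.constant([[2.0, 0.5]])

    loss = tf.losses.softmax_cross_entropy(
        onehot_labels, logits,
        scope='xent_loss',
        loss_collection=tf.GraphKeys.LOSSES)

    # The returned loss tensor was added to the default losses collection:
    assert loss in tf.losses.get_losses()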
Returns:
  Weighted loss Tensor of the same type as logits. If reduction is NONE, this has shape [batch_size]; otherwise, it is scalar.
Raises:
  ValueError: If the shape of logits doesn't match that of onehot_labels, or if the shape of weights is invalid, or if weights is None. Also if onehot_labels or logits is None.
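For example, a shape mismatch surfaces at graph-construction time (a sketch with illustrative shapes):

    import tensorflow as tf

    bad_labels = tf.constant([[1.0, 0.0, 0.0]])  # [1, 3]
    bad_logits = tf.constant([[2.0, 0.5]])       # [1, 2]

    try:
        tf.losses.softmax_cross_entropy(bad_labels, bad_logits)
    except ValueError as err:
        print(err)  # shapes are incompatible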
Eager Compatibility:
The loss_collection argument is ignored when executing eagerly. Consider holding on to the return value or collecting losses via a tf.keras.Model.
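A sketch of the eager-mode pattern (assuming TF 1.x with eager execution enabled; illustrative values):

    import tensorflow as tf
    tf.enable_eager_execution()

    onehot_labels = tf.constant([[1.0, 0.0]])
    logits = tf.constant([[2.0, 0.5]])

    # No collection is populated eagerly, so keep the returned value yourself:
    loss = tf.losses.softmax_cross_entropy(onehot_labels, logits)
    print(float(loss))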