tf.losses.softmax_cross_entropy(onehot_labels, logits, weights=1.0, label_smoothing=0, scope=None, loss_collection=tf.GraphKeys.LOSSES)
Creates a cross-entropy loss using tf.nn.softmax_cross_entropy_with_logits.
weights acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weights is a tensor of shape [batch_size], then the loss weights apply to each corresponding sample.
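For example, the following sketch (assuming a TF 1.x runtime; the 3-sample, 4-class batch is made up for illustration) contrasts a scalar weight with per-sample weights:

import tensorflow as tf

# Hypothetical batch: 3 samples, 4 classes.
onehot_labels = tf.one_hot([0, 1, 3], depth=4)
logits = tf.random_normal([3, 4])

# Scalar weight: the mean loss is simply scaled by 2.0.
loss_scaled = tf.losses.softmax_cross_entropy(
    onehot_labels, logits, weights=2.0)

# [batch_size] weights: the third sample contributes nothing to the loss.
loss_weighted = tf.losses.softmax_cross_entropy(
    onehot_labels, logits, weights=tf.constant([1.0, 1.0, 0.0]))

with tf.Session() as sess:
    print(sess.run([loss_scaled, loss_weighted]))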
If label_smoothing is nonzero, smooth the labels towards 1/num_classes:
new_onehot_labels = onehot_labels * (1 - label_smoothing) + label_smoothing / num_classes
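For example, with num_classes = 4 and label_smoothing = 0.1, a hard label [1, 0, 0, 0] becomes [0.925, 0.025, 0.025, 0.025]. A minimal NumPy sketch of that arithmetic (the function name is illustrative, not part of the API):

import numpy as np

def smooth_labels(onehot_labels, label_smoothing, num_classes):
    # Shrink hard labels toward the uniform distribution 1/num_classes,
    # mirroring the formula above.
    return (onehot_labels * (1.0 - label_smoothing)
            + label_smoothing / num_classes)

print(smooth_labels(np.array([1.0, 0.0, 0.0, 0.0]), 0.1, 4))
# -> [0.925 0.025 0.025 0.025]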
Args:
onehot_labels: [batch_size, num_classes] target one-hot-encoded labels.
logits: [batch_size, num_classes] logits outputs of the network.
weights: Optional Tensor whose rank is either 0, or the same rank as onehot_labels, and must be broadcastable to onehot_labels (i.e., all dimensions must be either 1, or the same as the corresponding losses dimension).
label_smoothing: If greater than 0 then smooth the labels.
scope: the scope for the operations performed in computing the loss.
loss_collection: collection to which the loss will be added.
Returns:
Tensor representing the mean loss value.
Raises:
ValueError: If the shape of logits doesn't match that of onehot_labels, or if the shape of weights is invalid, or if weights is None.