Creates a cross-entropy loss using `tf.nn.sigmoid_cross_entropy_with_logits`.

```python
tf.compat.v1.losses.sigmoid_cross_entropy(
    multi_class_labels, logits, weights=1.0, label_smoothing=0, scope=None,
    loss_collection=tf.GraphKeys.LOSSES,
    reduction=Reduction.SUM_BY_NONZERO_WEIGHTS
)
```
`weights` acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If `weights` is a tensor of shape `[batch_size]`, then the loss weights apply to each corresponding sample.

If `label_smoothing` is nonzero, smooth the labels towards 1/2:

    new_multiclass_labels = multiclass_labels * (1 - label_smoothing)
                            + 0.5 * label_smoothing
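As a quick numeric illustration of the smoothing formula above (plain NumPy, not the TensorFlow op; the label values here are made up):

```python
import numpy as np

# Hypothetical labels for illustration; label_smoothing matches the
# function argument of the same name.
label_smoothing = 0.2
multiclass_labels = np.array([0.0, 1.0, 1.0, 0.0])

# Each hard 0/1 label moves label_smoothing/2 towards 1/2.
new_multiclass_labels = (multiclass_labels * (1 - label_smoothing)
                         + 0.5 * label_smoothing)
# 0 -> 0.1 and 1 -> 0.9 when label_smoothing is 0.2
```

With `label_smoothing=0.2`, zeros become 0.1 and ones become 0.9, so the model is never asked to produce a saturated sigmoid output.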
Args:
- `multi_class_labels`: `[batch_size, num_classes]` target integer labels in `{0, 1}`.
- `logits`: Float `[batch_size, num_classes]` logits outputs of the network.
- `weights`: Optional `Tensor` whose rank is either 0, or the same rank as `multi_class_labels`, and must be broadcastable to `multi_class_labels` (i.e., all dimensions must be either `1`, or the same as the corresponding losses dimension).
- `label_smoothing`: If greater than `0` then smooth the labels.
- `scope`: The scope for the operations performed in computing the loss.
- `loss_collection`: collection to which the loss will be added.
- `reduction`: Type of reduction to apply to loss.
Returns:
Weighted loss `Tensor` of the same type as `logits`. If `reduction` is `NONE`, this has the same shape as `logits`; otherwise, it is scalar.
Raises:
- `ValueError`: If the shape of `logits` doesn't match that of `multi_class_labels`, or if the shape of `weights` is invalid, or if `weights` is `None`. Also if `multi_class_labels` or `logits` is `None`.
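To make the default `SUM_BY_NONZERO_WEIGHTS` reduction concrete, here is a minimal NumPy sketch of the math: the numerically stable elementwise formulation used by `tf.nn.sigmoid_cross_entropy_with_logits`, followed by dividing the weighted sum by the number of nonzero weights. This is an illustration under stated assumptions, not the library implementation, and `sigmoid_cross_entropy_np` is a hypothetical helper name.

```python
import numpy as np

def sigmoid_cross_entropy_np(multi_class_labels, logits, weights=1.0):
    """NumPy sketch of the loss with SUM_BY_NONZERO_WEIGHTS reduction.

    Assumes `weights` is a scalar or an array already broadcastable to
    the elementwise loss shape (a per-sample [batch_size] vector would
    first need a reshape to [batch_size, 1]).
    """
    x = np.asarray(logits, dtype=float)
    z = np.asarray(multi_class_labels, dtype=float)
    # Numerically stable elementwise sigmoid cross-entropy:
    #   max(x, 0) - x * z + log(1 + exp(-|x|))
    per_element = np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))
    # Weight each element, then average over the nonzero weights.
    w = np.broadcast_to(np.asarray(weights, dtype=float), per_element.shape)
    nonzero = np.count_nonzero(w)
    return (per_element * w).sum() / max(nonzero, 1)

# With all-zero logits the sigmoid outputs 0.5 everywhere, so each
# element contributes log(2) regardless of its label.
loss = sigmoid_cross_entropy_np(np.array([[0., 1.], [1., 0.]]),
                                np.zeros((2, 2)))
```

Because the sum is divided by the count of nonzero weights rather than the total element count, zero-weighted entries are excluded from the average instead of dragging it toward zero.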
Eager Compatibility:
The `loss_collection` argument is ignored when executing eagerly. Consider holding on to the return value or collecting losses via a `tf.keras.Model`.