Computes the weighted loss.
```python
tf.compat.v1.losses.compute_weighted_loss(
    losses,
    weights=1.0,
    scope=None,
    loss_collection=ops.GraphKeys.LOSSES,
    reduction=Reduction.SUM_BY_NONZERO_WEIGHTS
)
```
Returns

A weighted loss `Tensor` of the same type as `losses`. If `reduction` is `NONE`, this has the same shape as `losses`; otherwise, it is a scalar.
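The default `SUM_BY_NONZERO_WEIGHTS` reduction divides the weighted sum of losses by the number of nonzero weights, so zero-weighted elements neither contribute to the sum nor dilute the mean. A plain-Python sketch of that reduction (the function name below is illustrative, not part of the TensorFlow API):

```python
# Plain-Python sketch of the SUM_BY_NONZERO_WEIGHTS reduction:
# sum(losses * weights) / count(weights != 0).
# Illustrative only; the real op is tf.compat.v1.losses.compute_weighted_loss.
def weighted_loss_sum_by_nonzero_weights(losses, weights):
    weighted_sum = sum(l * w for l, w in zip(losses, weights))
    num_nonzero = sum(1 for w in weights if w != 0.0)
    return weighted_sum / num_nonzero if num_nonzero else 0.0

# The zero-weighted element contributes nothing and is excluded
# from the divisor: (1*1 + 3*1 + 5*0) / 2 = 2.0
loss = weighted_loss_sum_by_nonzero_weights([1.0, 3.0, 5.0], [1.0, 1.0, 0.0])
```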
Raises

`ValueError`: If `weights` is `None`, if its shape is not compatible with `losses`, or if the number of dimensions (rank) of either `losses` or `weights` is unknown.
Note

When calculating the gradient of a weighted loss, contributions from both `losses` and `weights` are considered. If your `weights` depend on some model parameters but you do not want this to affect the loss gradient, apply `tf.stop_gradient` to `weights` before passing them to `compute_weighted_loss`.
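The product rule is why gradients flow through the weights: for a scalar parameter, d(w * l)/dθ = w′·l + w·l′, and stopping the gradient on the weights keeps only the w·l′ term. A plain-Python finite-difference sketch of the two cases (the functions `w` and `l` are hypothetical, not TensorFlow):

```python
# Finite-difference sketch of why gradients flow through weights.
# w(theta) and l(theta) are hypothetical scalar functions of a parameter.
def w(theta):
    return 2.0 * theta      # weight depends on the parameter

def l(theta):
    return theta * theta    # per-example loss also depends on the parameter

def grad(f, theta, eps=1e-5):
    """Central finite-difference approximation of df/dtheta."""
    return (f(theta + eps) - f(theta - eps)) / (2 * eps)

theta = 3.0
# Full gradient of w(theta) * l(theta): product rule gives w'l + wl'.
full_grad = grad(lambda t: w(t) * l(t), theta)      # 2*9 + 6*6 = 54
# Treating weights as constant (the effect of tf.stop_gradient): w * l' only.
stopped_grad = w(theta) * grad(l, theta)            # 6*6 = 36
```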
Eager compatibility

The `loss_collection` argument is ignored when executing eagerly. Consider holding on to the return value, or collecting losses via a `tf.keras.Model`.