tf.losses.compute_weighted_loss(
    losses,
    weights=1.0,
    scope=None,
    loss_collection=tf.GraphKeys.LOSSES,
    reduction=Reduction.SUM_BY_NONZERO_WEIGHTS
)
Computes the weighted loss.
Args:
  losses: Tensor of shape [batch_size, d1, ... dN].
  weights: Optional Tensor whose rank is either 0, or the same rank as losses, and must be broadcastable to losses (i.e., all dimensions must be either 1, or the same as the corresponding losses dimension).
  scope: the scope for the operations performed in computing the loss.
  loss_collection: the loss will be added to these collections.
  reduction: Type of reduction to apply to loss.
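The broadcastability requirement on weights follows standard broadcasting rules, which can be sketched with NumPy (the shapes below are illustrative, not from the original doc):

```python
import numpy as np

# Per-example losses of shape [batch_size, d1]; each candidate
# weights shape has rank 0 or matches losses' rank, with every
# dimension equal to 1 or to the corresponding losses dimension.
losses = np.zeros((4, 3))
for w_shape in [(), (4, 1), (1, 3), (4, 3)]:
    w = np.ones(w_shape)
    # broadcast_to raises ValueError if w is not broadcastable.
    assert np.broadcast_to(w, losses.shape).shape == (4, 3)
```

A rank-0 weight (a scalar such as the default 1.0) always broadcasts; a shape like (3,) for the example above would be rejected because its rank differs from that of losses.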
Returns:
  Weighted loss Tensor of the same type as losses. If reduction is NONE, this has the same shape as losses; otherwise, it is scalar.
Raises:
  ValueError: If weights is None or the shape is not compatible with losses, or if the number of dimensions (rank) of either losses or weights is missing.
Eager Compatibility:
  The loss_collection argument is ignored when executing eagerly. Consider holding on to the return value or collecting losses via a tf.keras.Model.
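The default SUM_BY_NONZERO_WEIGHTS reduction divides the weighted sum of losses by the number of elements with a nonzero weight, so zero-weighted examples neither contribute to the loss nor inflate the denominator. A minimal NumPy sketch of that behavior (an illustration of the reduction semantics, not the library implementation):

```python
import numpy as np

def weighted_loss_sketch(losses, weights=1.0):
    # Broadcast weights to the shape of losses, as the weights
    # argument requires.
    losses = np.asarray(losses, dtype=np.float64)
    weights = np.broadcast_to(
        np.asarray(weights, dtype=np.float64), losses.shape)
    # SUM_BY_NONZERO_WEIGHTS: weighted sum / count of nonzero weights.
    num_nonzero = np.count_nonzero(weights)
    if num_nonzero == 0:
        return 0.0
    return float(np.sum(losses * weights) / num_nonzero)

# Batch of four per-example losses; the last example is masked out.
losses = [1.0, 2.0, 3.0, 4.0]
weights = [1.0, 1.0, 1.0, 0.0]
print(weighted_loss_sketch(losses, weights))  # (1 + 2 + 3) / 3 = 2.0
```

With the default scalar weight of 1.0 this reduces to the plain mean of losses, which is why masking via zero weights is the idiomatic way to exclude padded or invalid examples from the average.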