tf.keras.losses.KLDivergence

Computes the Kullback-Leibler divergence loss between y_true and y_pred.

Inherits From: Loss

Formula:

loss = y_true * log(y_true / y_pred)

y_true and y_pred are expected to be probability distributions, with values between 0 and 1. They will get clipped to the [0, 1] range.
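The formula and clipping behavior above can be sketched in plain Python. This is an illustrative sketch of the math, not the library's actual implementation; the `eps` value stands in for the backend epsilon used when clipping.

```python
import math

def kl_divergence(y_true, y_pred, eps=1e-7):
    # Clip each probability into [eps, 1], mirroring the documented
    # clipping of values to the [0, 1] range (eps avoids log(0)).
    clip = lambda p: min(max(p, eps), 1.0)
    # Sum y_true * log(y_true / y_pred) over the distribution's entries.
    return sum(clip(t) * math.log(clip(t) / clip(p))
               for t, p in zip(y_true, y_pred))

loss = kl_divergence([0.4, 0.6], [0.5, 0.5])
```

Identical distributions yield a loss of zero, and the divergence grows as y_pred drifts away from y_true.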

Args

reduction: Type of reduction to apply to the loss. In almost all cases this should be "sum_over_batch_size". Supported options are "sum", "sum_over_batch_size" or None.
name: Optional name for the loss instance.

Methods

call

call(y_true, y_pred)

Computes the Kullback-Leibler divergence loss values.

from_config

from_config(config)

Instantiates a Loss from its config (output of get_config()).

get_config

get_config()

Returns the config dictionary for a Loss instance.

__call__

__call__(y_true, y_pred, sample_weight=None)

Call self as a function.