Computes Kullback-Leibler divergence loss between `y_true` and `y_pred`.
View aliases

Main aliases: `tf.keras.losses.KLD`, `tf.keras.losses.kl_divergence`, `tf.keras.losses.kld`, `tf.keras.losses.kullback_leibler_divergence`, `tf.keras.metrics.KLD`, `tf.keras.metrics.kld`, `tf.keras.metrics.kullback_leibler_divergence`, `tf.losses.KLD`, `tf.losses.kl_divergence`, `tf.losses.kld`, `tf.losses.kullback_leibler_divergence`, `tf.metrics.KLD`, `tf.metrics.kl_divergence`, `tf.metrics.kld`, `tf.metrics.kullback_leibler_divergence`
Compat aliases for migration. See the Migration guide for more details.
`tf.compat.v1.keras.losses.KLD`, `tf.compat.v1.keras.losses.kl_divergence`, `tf.compat.v1.keras.losses.kld`, `tf.compat.v1.keras.losses.kullback_leibler_divergence`, `tf.compat.v1.keras.metrics.KLD`, `tf.compat.v1.keras.metrics.kl_divergence`, `tf.compat.v1.keras.metrics.kld`, `tf.compat.v1.keras.metrics.kullback_leibler_divergence`
```python
tf.keras.metrics.kl_divergence(
    y_true, y_pred
)
```
The loss is computed element-wise as `loss = y_true * log(y_true / y_pred)` and summed over the last axis, yielding one value per sample.
See: https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
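As a quick illustration of the formula (the numbers below are made up for this sketch, not taken from the docs), the divergence of one concrete distribution from another can be worked out by hand:

```python
import numpy as np

# Hand-worked instance of loss = y_true * log(y_true / y_pred),
# summed over the last axis. Values are illustrative only.
y_true = np.array([0.5, 0.5])   # reference distribution
y_pred = np.array([0.9, 0.1])   # approximating distribution
kl = np.sum(y_true * np.log(y_true / y_pred))
print(kl)  # 0.5*log(0.5/0.9) + 0.5*log(0.5/0.1) ≈ 0.5108
```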
Standalone usage:

```python
import numpy as np
import tensorflow as tf

# Random binary targets and random predictions, both shaped (2, 3).
y_true = np.random.randint(0, 2, size=(2, 3)).astype(np.float64)
y_pred = np.random.random(size=(2, 3))
loss = tf.keras.losses.kullback_leibler_divergence(y_true, y_pred)
assert loss.shape == (2,)  # one loss value per sample

# The loss clips both inputs to [1e-7, 1] before taking the log,
# so the reference computation below clips the same way.
y_true = tf.keras.backend.clip(y_true, 1e-7, 1)
y_pred = tf.keras.backend.clip(y_pred, 1e-7, 1)
assert np.array_equal(
    loss.numpy(), np.sum(y_true * np.log(y_true / y_pred), axis=-1))
```
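The same function can also be passed directly as a training loss (or metric) when compiling a Keras model. A minimal sketch follows; the model architecture here is an assumption for illustration, not part of this API:

```python
import tensorflow as tf

# Illustrative classifier; KL divergence expects y_true and y_pred to
# behave like probability distributions (e.g. softmax outputs and
# one-hot or soft targets).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(3, activation='softmax'),
])
model.compile(optimizer='adam',
              loss=tf.keras.losses.kullback_leibler_divergence,
              metrics=[tf.keras.metrics.kl_divergence])
```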
Args:
- `y_true`: Tensor of true targets.
- `y_pred`: Tensor of predicted targets.

Returns:
A `Tensor` with loss.
Raises:
- `TypeError`: If `y_true` cannot be cast to `y_pred.dtype`.