
Computes Kullback-Leibler divergence loss between y_true and y_pred.

loss = y_true * log(y_true / y_pred)



loss = tf.keras.losses.KLD([.4, .9, .2], [.5, .8, .12])
print('Loss: ', loss.numpy())  # Loss: 0.11891246
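The formula can be checked by hand without TensorFlow: summing y_true * log(y_true / y_pred) over the elements reproduces the value above. This is a standalone sketch using only the standard library, not a substitute for the TensorFlow implementation (which also clips inputs for numerical stability):

```python
import math

y_true = [0.4, 0.9, 0.2]
y_pred = [0.5, 0.8, 0.12]

# KL divergence: sum of y_true * log(y_true / y_pred) over the last axis
loss = sum(t * math.log(t / p) for t, p in zip(y_true, y_pred))
print(loss)  # ~0.118912, matching tf.keras.losses.KLD above
```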

Args:
  y_true: Tensor of true targets.
  y_pred: Tensor of predicted targets.

Returns:
  A Tensor with loss.

Raises:
  TypeError: If y_true cannot be cast to the dtype of y_pred.