tf.keras.losses.KLDivergence

Computes the Kullback-Leibler divergence loss between y_true and y_pred.

Inherits From: Loss

Formula:

loss = y_true * log(y_true / y_pred)
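
As an illustration of the formula, the loss can be called standalone on probability-like tensors; the values below are only a sketch:

import tensorflow as tf

# Illustrative distributions; both arguments are treated as probabilities
# along the last axis.
y_true = [[0.0, 1.0], [0.0, 0.0]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]

kl = tf.keras.losses.KLDivergence()
loss = kl(y_true, y_pred)
print(float(loss))  # approximately 0.458 for these values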

Args

reduction: Type of reduction to apply to the loss. In almost all cases this should be "sum_over_batch_size". Supported options are "sum", "sum_over_batch_size", or None.
name: Optional name for the loss instance.
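
A minimal sketch of passing these arguments, and of using the loss when compiling a model (the architecture below is an arbitrary placeholder):

import tensorflow as tf

# reduction="sum" and the name are shown only to illustrate the arguments.
kl_sum = tf.keras.losses.KLDivergence(reduction="sum", name="kl_sum")

# The loss can also be passed directly to compile().
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="softmax"),
])
model.compile(optimizer="adam", loss=tf.keras.losses.KLDivergence())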

Methods

call

call(y_true, y_pred)

Computes the per-sample Kullback-Leibler divergence loss values from y_true and y_pred.

from_config

from_config(config)

Instantiates a KLDivergence loss from its config dictionary (the output of get_config).

get_config

get_config()

Returns the config dictionary of the loss instance, used for serialization.

__call__

__call__(y_true, y_pred, sample_weight=None)

Call self as a function: computes the loss, applies the optional sample_weight, and reduces the result according to reduction.
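
A short sketch of how these methods fit together; the tensors and weights below are illustrative:

import tensorflow as tf

y_true = [[0.0, 1.0], [0.0, 0.0]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]

kl = tf.keras.losses.KLDivergence()

# __call__ accepts an optional per-sample weight applied before reduction.
weighted = kl(y_true, y_pred, sample_weight=[0.8, 0.2])

# get_config() / from_config() round-trip the instance, e.g. for serialization.
config = kl.get_config()
restored = tf.keras.losses.KLDivergence.from_config(config)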