tf.keras.metrics.KLDivergence

Computes the Kullback-Leibler divergence metric between y_true and y_pred.

Inherits From: MeanMetricWrapper, Mean, Metric

Formula:

metric = y_true * log(y_true / y_pred)

y_true and y_pred are expected to be probability distributions, with values between 0 and 1. They will get clipped to the [0, 1] range.
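
As an informal illustration, the following NumPy sketch reproduces this computation on the example data used below (the epsilon value is an assumption chosen for the sketch; the actual clipping constant used by Keras may differ):

import numpy as np

# Rough check of the formula: clip both distributions, then average
# y_true * log(y_true / y_pred) over the samples.
y_true = np.array([[0.0, 1.0], [0.0, 0.0]])
y_pred = np.array([[0.6, 0.4], [0.4, 0.6]])

eps = 1e-7  # assumed small constant to avoid log(0)
y_true_c = np.clip(y_true, eps, 1.0)
y_pred_c = np.clip(y_pred, eps, 1.0)

# Sum over the distribution axis, then average over the samples.
kl_per_sample = np.sum(y_true_c * np.log(y_true_c / y_pred_c), axis=-1)
print(kl_per_sample.mean())  # ~0.458, matching the metric example below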

Args

name (Optional) string name of the metric instance.
dtype (Optional) data type of the metric result.

Examples:

>>> m = keras.metrics.KLDivergence()
>>> m.update_state([[0, 1], [0, 0]], [[0.6, 0.4], [0.4, 0.6]])
>>> m.result()
0.45814306

>>> m.reset_state()
>>> m.update_state([[0, 1], [0, 0]], [[0.6, 0.4], [0.4, 0.6]],
...                sample_weight=[1, 0])
>>> m.result()
0.9162892

Usage with compile() API:

model.compile(optimizer='sgd',
              loss='mse',
              metrics=[keras.metrics.KLDivergence()])
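
A more complete sketch of the same usage; the toy model and random data below are assumptions chosen purely for illustration:

import numpy as np
import keras

# Toy two-class softmax model, used only to show the metric in training.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer='sgd',
              loss='mse',
              metrics=[keras.metrics.KLDivergence()])

x = np.random.rand(16, 4).astype("float32")
y = keras.utils.to_categorical(np.random.randint(0, 2, size=(16,)), num_classes=2)
model.fit(x, y, epochs=1, verbose=0)  # the KL divergence is reported alongside the loss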

Attributes

dtype

variables

Methods

add_variable

add_weight

from_config

get_config

Return the serializable config of the metric.
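
A small round-trip sketch (the metric name used here is an arbitrary example):

import keras

m = keras.metrics.KLDivergence(name="kld", dtype="float32")
config = m.get_config()                               # e.g. {'name': 'kld', 'dtype': 'float32'}
m2 = keras.metrics.KLDivergence.from_config(config)   # equivalent, freshly initialized metric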

reset_state

Reset all of the metric state variables.

This function is called between epochs/steps, when a metric is evaluated during training.

result

Compute the current metric value.

Returns
A scalar tensor, or a dictionary of scalar tensors.

stateless_reset_state

stateless_result

stateless_update_state
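
The stateless_* variants are intended for functional, side-effect-free use (for example with the JAX backend). The sketch below assumes they accept and return lists of metric variable values rather than mutating the metric in place:

import keras

m = keras.metrics.KLDivergence()
metric_vars = m.stateless_reset_state()                 # fresh variable values, state untouched
metric_vars = m.stateless_update_state(metric_vars,
                                       [[0, 1]], [[0.6, 0.4]])  # returns updated values
value = m.stateless_result(metric_vars)                 # result computed from those values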

update_state

Accumulate statistics for the metric.
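
A typical custom-loop pattern, sketched with made-up batch data: call update_state once per batch and read result at the end.

import keras

m = keras.metrics.KLDivergence()
batches = [
    ([[0, 1]], [[0.6, 0.4]]),
    ([[0, 0]], [[0.4, 0.6]]),
]
for y_true, y_pred in batches:
    m.update_state(y_true, y_pred)   # accumulates a running total and count

print(float(m.result()))             # mean KL divergence over all batches seen (~0.458 here)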

__call__

Call self as a function.
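
Calling the metric object directly combines update_state and result; a brief sketch reusing the example data above:

import keras

m = keras.metrics.KLDivergence()
# Updates the metric's state with this batch and returns the current result.
value = m([[0, 1], [0, 0]], [[0.6, 0.4], [0.4, 0.6]])
print(float(value))  # ~0.458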