Kullback-Leibler Divergence

tf.contrib.distributions.kl(dist_a, dist_b, allow_nan=False, name=None)

Get the KL-divergence KL(dist_a || dist_b).
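For distributions a = dist_a and b = dist_b with densities p_a and p_b, the quantity computed is the standard divergence, evaluated in closed form per batch member (stated here for reference):

\mathrm{KL}(a \,\|\, b) = \mathbb{E}_{x \sim a}\left[\log p_a(x) - \log p_b(x)\right]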

Args:
  • dist_a: The first distribution.
  • dist_b: The second distribution.
  • allow_nan: If False (default), a runtime error is raised if the KL returns NaN values for any batch entry of the given distributions. If True, the KL may return NaN for those entries.
  • name: (optional) Name scope to use for created operations.
Returns:

A Tensor with the batchwise KL-divergence between dist_a and dist_b.

Raises:
  • NotImplementedError: If no KL method is defined for distribution types of dist_a and dist_b.
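A minimal usage sketch (assuming the Normal distribution from the same contrib module; the mu/sigma constructor arguments below follow the contrib API of this era and are renamed in later releases):

import tensorflow as tf

distributions = tf.contrib.distributions

# Two batches of two univariate Normals each; the KL is computed entrywise.
dist_a = distributions.Normal(mu=[0.0, 1.0], sigma=[1.0, 2.0])
dist_b = distributions.Normal(mu=[0.5, 0.0], sigma=[1.0, 1.0])

# Batchwise KL(dist_a || dist_b): a Tensor with one value per batch entry.
kl_ab = distributions.kl(dist_a, dist_b)

with tf.Session() as sess:
  print(sess.run(kl_ab))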

class tf.contrib.distributions.RegisterKL

Decorator to register a KL divergence implementation function.

Usage:

@distributions.RegisterKL(distributions.Normal, distributions.Normal)
def _kl_normal_mvn(norm_a, norm_b):
  # Return KL(norm_a || norm_b)


tf.contrib.distributions.RegisterKL.__call__(kl_fn)

Perform the KL registration.

Args:
  • kl_fn: The function to use for the KL divergence.
Returns:

kl_fn

Raises:
  • TypeError: If kl_fn is not a callable.
  • ValueError: If a KL divergence function has already been registered for the given argument classes.

tf.contrib.distributions.RegisterKL.__init__(dist_cls_a, dist_cls_b)

Initialize the KL registrar.

Args:
  • dist_cls_a: The class of the first argument of the KL divergence.
  • dist_cls_b: The class of the second argument of the KL divergence.
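
A minimal end-to-end sketch of the registration mechanics. Re-registering the (Normal, Normal) pair would raise the ValueError above, since the library already provides that KL, so this sketch registers a hypothetical subclass; the ShiftedNormal class, the helper name, and the mu/sigma attribute names are illustrative assumptions, not part of the library:

import tensorflow as tf

distributions = tf.contrib.distributions

# Hypothetical distribution class used only to demonstrate registration;
# (Normal, Normal) already has a built-in KL, so that pair is taken.
class ShiftedNormal(distributions.Normal):
  pass

@distributions.RegisterKL(ShiftedNormal, distributions.Normal)
def _kl_shifted_normal_normal(dist_a, dist_b, name=None):
  # Closed-form KL(N(mu_a, sigma_a^2) || N(mu_b, sigma_b^2)), batchwise.
  # name is accepted defensively in case the dispatcher forwards one.
  var_a = tf.square(dist_a.sigma)
  var_b = tf.square(dist_b.sigma)
  return (tf.log(dist_b.sigma) - tf.log(dist_a.sigma)
          + (var_a + tf.square(dist_a.mu - dist_b.mu)) / (2.0 * var_b)
          - 0.5)

Once registered, distributions.kl dispatches on the pair of argument classes, so a call such as distributions.kl(ShiftedNormal(mu=0.0, sigma=1.0), distributions.Normal(mu=1.0, sigma=2.0)) resolves to the function above.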