tfp.vi.kl_forward

The forward Kullback-Leibler Csiszar-function in log-space.

A Csiszar-function is a member of

F = { f:R_+ to R : f convex }.

When self_normalized = True, the KL-forward Csiszar-function is:

f(u) = u log(u) - (u - 1)

When self_normalized = False, the (u - 1) term is omitted.
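As a sketch of what this definition means numerically, the following standalone NumPy function (a hypothetical helper, not the library implementation) evaluates f at u = exp(logu):

```python
import numpy as np

def kl_forward_sketch(logu, self_normalized=False):
    # Evaluate f(u) = u log(u) at u = exp(logu):
    # u log(u) = exp(logu) * logu.
    u = np.exp(logu)
    f = u * logu
    if self_normalized:
        # Subtract (u - 1) so that f(1) = 0 and f'(1) = 0.
        f = f - (u - 1.)
    return f
```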

Observe that as an f-Divergence, this Csiszar-function implies:

D_f[p, q] = KL[p, q]

The KL is "forward" because, as in maximum likelihood estimation, the distribution being optimized (q) sits in the second slot; that is, we minimize KL[p, q] over q.
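To make the identity concrete, here is a small Monte Carlo check in NumPy. It assumes the f-divergence convention D_f[p, q] = E_q[f(p/q)] used elsewhere in tfp.vi; the Gaussian parameters are arbitrary example values:

```python
import numpy as np

rng = np.random.default_rng(0)

def norm_logpdf(x, mu, sigma):
    # Log-density of a univariate Gaussian.
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

mu_p, s_p = 0.0, 1.0   # example "target" p
mu_q, s_q = 0.5, 1.2   # example "approximation" q

# Sample from q and form log(u) = log p(x) - log q(x).
x = rng.normal(mu_q, s_q, size=200_000)
logu = norm_logpdf(x, mu_p, s_p) - norm_logpdf(x, mu_q, s_q)

# E_q[f(p/q)] with f(u) = u log(u); this estimates the f-divergence.
estimate = np.mean(np.exp(logu) * logu)

# Closed-form KL[p, q] between the two Gaussians, for comparison.
exact = (np.log(s_q / s_p)
         + (s_p**2 + (mu_p - mu_q)**2) / (2 * s_q**2)
         - 0.5)
print(estimate, exact)  # the two agree up to Monte Carlo error
```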

Args:
  logu: float-like Tensor representing log(u) from above.
  self_normalized: Python bool indicating whether f'(u=1) = 0. When f'(u=1) = 0, the implied Csiszar f-Divergence remains non-negative even when p, q are unnormalized measures.
  name: Python str name prefixed to Ops created by this function.

Returns:
  kl_forward_of_u: float-like Tensor of the Csiszar-function evaluated at u = exp(logu).

Raises:
  TypeError: if self_normalized is None or a Tensor.
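A minimal usage sketch, assuming TensorFlow and TensorFlow Probability are importable and running in eager mode:

```python
import tensorflow as tf
import tensorflow_probability as tfp

# log(u) at a few sample points; u = exp(logu).
logu = tf.constant([-1., 0., 1.])

# Self-normalized variant: f(u) = u log(u) - (u - 1).
f = tfp.vi.kl_forward(logu, self_normalized=True)
print(f.numpy())  # f(1) = 0 at logu = 0; positive elsewhere
```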