The forward Kullback-Leibler Csiszar-function in log-space.
`tfp.substrates.jax.vi.kl_forward(logu, self_normalized=False, name=None)`
A Csiszar-function is a member of
F = { f: R_+ to R : f convex }.
When `self_normalized = True`, the KL-forward Csiszar-function is:
f(u) = u log(u) - (u - 1)
When `self_normalized = False`, the (u - 1) term is omitted.
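As a rough illustration of the log-space evaluation (a minimal sketch, not the library's implementation), the formula above can be written with `jax.numpy` as follows; the name `kl_forward_sketch` is made up for this example:

```python
import jax.numpy as jnp

def kl_forward_sketch(logu, self_normalized=False):
  # With u = exp(logu): f(u) = u * log(u), optionally minus (u - 1).
  f = jnp.exp(logu) * logu
  if self_normalized:
    f = f - jnp.expm1(logu)  # expm1(logu) = u - 1, computed stably
  return f

# f(2) = 2 * log(2) - (2 - 1) ~= 0.386
print(kl_forward_sketch(jnp.log(2.0), self_normalized=True))
```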
Observe that as an f-Divergence, this Csiszar-function implies:
D_f[p, q] = KL[p, q]
The KL is "forward" because in maximum likelihood we think of minimizing q
as in KL[p, q]
.
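For instance, a Monte Carlo estimate of the f-divergence E_{x~q}[f(p(x)/q(x))] recovers KL[p, q]. The sketch below assumes the JAX substrate is imported as shown and uses two arbitrary Normal distributions purely for illustration:

```python
import jax
from tensorflow_probability.substrates import jax as tfp

tfd = tfp.distributions

p = tfd.Normal(loc=0., scale=1.)
q = tfd.Normal(loc=0.5, scale=1.2)

# Draw samples from q and evaluate logu = log(p(x) / q(x)).
x = q.sample(100_000, seed=jax.random.PRNGKey(0))
logu = p.log_prob(x) - q.log_prob(x)

# E_q[f(u)] approximates D_f[p, q] = KL[p, q].
mc_estimate = tfp.vi.kl_forward(logu).mean()
exact = tfd.kl_divergence(p, q)
print(mc_estimate, exact)
```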
Returns | |
---|---|
`kl_forward_of_u` | `float`-like `Tensor` of the Csiszar-function evaluated at `u = exp(logu)`. |
Raises | |
---|---|
`TypeError` | if `self_normalized` is `None` or a `Tensor`. |