The reverse Kullback-Leibler Csiszar-function in log-space.
tfp.vi.kl_reverse(
    logu, self_normalized=False, name=None
)
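For example, a minimal call at u = 2 (illustrative only; the returned value follows from the formulas below):

```python
import tensorflow as tf
import tensorflow_probability as tfp

# Default self_normalized=False: f(u) = -log(u), so at u = 2 this
# returns -log(2) ≈ -0.693.
tfp.vi.kl_reverse(tf.math.log(2.0))
```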
A Csiszar-function is a member of F = { f: R_+ to R : f convex }.
When self_normalized = True, the KL-reverse Csiszar-function is:
f(u) = -log(u) + (u - 1)
When self_normalized = False, the (u - 1) term is omitted, leaving f(u) = -log(u).
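A minimal NumPy sketch of the two cases, as a reference for the math only (not the library implementation; `kl_reverse_reference` is a hypothetical name):

```python
import numpy as np

def kl_reverse_reference(logu, self_normalized=False):
    # u = exp(logu); np.expm1(logu) computes u - 1 accurately near logu = 0.
    return -logu + (np.expm1(logu) if self_normalized else 0.0)

kl_reverse_reference(np.log(2.0))                        # -log(2) ≈ -0.693
kl_reverse_reference(np.log(2.0), self_normalized=True)  # -log(2) + 1 ≈ 0.307
```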
Observe that as an f-Divergence, this Csiszar-function implies:
D_f[p, q] = KL[q, p]
The KL is "reverse" because in maximum likelihood we typically think of
minimizing over q in KL[p, q]; this function instead corresponds to KL[q, p].
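As a concrete check, here is a sketch of a Monte Carlo estimate of D_f[p, q] = E_q[f(p(X)/q(X))] using samples from q; with this f it should match the closed-form KL[q, p]. The Normal distributions, seed, and sample size are arbitrary illustration choices:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

p = tfd.Normal(loc=0., scale=1.)
q = tfd.Normal(loc=1., scale=2.)

# D_f[p, q] = E_q[ f(p(X) / q(X)) ], estimated with samples from q.
x = q.sample(100_000, seed=42)
logu = p.log_prob(x) - q.log_prob(x)   # log(u) = log(p(x) / q(x))
mc_estimate = tf.reduce_mean(tfp.vi.kl_reverse(logu))

exact = tfd.kl_divergence(q, p)        # closed-form KL[q, p] ≈ 1.31
```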
| Returns | |
|---|---|
| `kl_reverse_of_u` | float-like `Tensor` of the Csiszar-function evaluated at `u = exp(logu)`. |
| Raises | |
|---|---|
| `TypeError` | if `self_normalized` is `None` or a `Tensor`. |