The Modified-GAN Csiszar-function in log-space.
tfp.substrates.jax.vi.modified_gan(
    logu, self_normalized=False, name=None
)
A Csiszar-function is a member of

F = { f: R_+ to R : f convex }.
When self_normalized = True, the modified-GAN (Generative Adversarial Network) Csiszar-function is

f(u) = log(1 + u) - log(u) + 0.5 (u - 1)

When self_normalized = False, the 0.5 (u - 1) term is omitted.
The unmodified GAN Csiszar-function is identical to Jensen-Shannon (with self_normalized = False).
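As a quick sanity check of the two cases above, the sketch below (the sample values and variable names are illustrative assumptions, not part of this page) evaluates the function at a few points and compares the result with the formula written directly in jax.numpy:

```python
# Minimal sketch: compare modified_gan against the formula above.
import jax.numpy as jnp
from tensorflow_probability.substrates import jax as tfp

u = jnp.array([0.5, 1.0, 2.0])   # illustrative values of u
logu = jnp.log(u)

f_plain = tfp.vi.modified_gan(logu)                        # self_normalized=False
f_norm = tfp.vi.modified_gan(logu, self_normalized=True)   # adds 0.5 (u - 1)

# The same quantities computed directly from the definitions.
assert jnp.allclose(f_plain, jnp.log1p(u) - logu)
assert jnp.allclose(f_norm, jnp.log1p(u) - logu + 0.5 * (u - 1.0))
```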
| Returns | |
|---|---|
| chi_square_of_u | float-like Tensor of the Csiszar-function evaluated at u = exp(logu). |
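A Csiszar-function like this one is usually consumed by a Monte Carlo f-divergence estimator rather than called on its own. The following hypothetical sketch assumes tfp.substrates.jax.vi.monte_carlo_variational_loss (a real TFP function, but not documented on this page) and two illustrative Normal distributions standing in for the target and surrogate:

```python
# Hypothetical usage sketch: the distributions and sample_size are
# illustrative choices, not part of this API page.
import jax
from tensorflow_probability.substrates import jax as tfp
tfd = tfp.distributions

target = tfd.Normal(loc=1.0, scale=2.0)      # stand-in target density p
surrogate = tfd.Normal(loc=0.0, scale=1.0)   # stand-in surrogate density q

# Monte Carlo estimate of the f-divergence generated by modified_gan.
loss = tfp.vi.monte_carlo_variational_loss(
    target_log_prob_fn=target.log_prob,
    surrogate_posterior=surrogate,
    discrepancy_fn=tfp.vi.modified_gan,
    sample_size=1000,
    seed=jax.random.PRNGKey(0))
```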