# tfp.vi.modified_gan

[View source on GitHub](https://github.com/tensorflow/probability/blob/v0.23.0/tensorflow_probability/python/vi/csiszar_divergence.py#L720-L761)

The Modified-GAN Csiszar-function in log-space.

    tfp.vi.modified_gan(
        logu, self_normalized=False, name=None
    )

A Csiszar-function is a member of,

    F = { f:R_+ to R : f convex }.

When `self_normalized = True` the modified-GAN (Generative Adversarial
Network) Csiszar-function is:

    f(u) = log(1 + u) - log(u) + 0.5 (u - 1)

When `self_normalized = False` the `0.5 (u - 1)` term is omitted.
The unmodified GAN Csiszar-function is identical to Jensen-Shannon (with
`self_normalized = False`).

**Warning:** this function makes non-log-space calculations and may therefore
be numerically unstable for `|logu| >> 0`.
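
For intuition, the formula can be evaluated directly from `logu`. The sketch
below is illustrative only, not the library's implementation; it rewrites
`log(1 + u) - log(u)` as `softplus(-logu)`, which happens to stay in
log-space:

    import tensorflow as tf

    def modified_gan_reference(logu, self_normalized=False):
      """Evaluates f(u) = log(1 + u) - log(u) [+ 0.5 (u - 1)] at u = exp(logu)."""
      logu = tf.convert_to_tensor(logu)
      # log(1 + u) - log(u) = log(1 + 1/u) = softplus(-logu).
      y = tf.math.softplus(-logu)
      if self_normalized:
        # 0.5 * (u - 1), with u = exp(logu).
        y += 0.5 * tf.math.expm1(logu)
      return y
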
| Args | |
|------|---|
| `logu` | `float`-like `Tensor` representing `log(u)` from above. |
| `self_normalized` | Python `bool` indicating whether `f'(u=1)=0`. When `f'(u=1)=0` the implied Csiszar f-Divergence remains non-negative even when `p, q` are unnormalized measures. |
| `name` | Python `str` name prefixed to Ops created by this function. |
| Returns | |
|---------|---|
| `modified_gan_of_u` | `float`-like `Tensor` of the Csiszar-function evaluated at `u = exp(logu)`. |
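
A quick usage sketch (the values of `logu` here are arbitrary; only the call
pattern comes from the signature above):

    import tensorflow as tf
    import tensorflow_probability as tfp

    logu = tf.constant([-2.0, 0.0, 2.0])

    # Self-normalized variant: f(u) = log(1 + u) - log(u) + 0.5 (u - 1).
    f_sn = tfp.vi.modified_gan(logu, self_normalized=True)

    # Default variant omits the 0.5 (u - 1) term.
    f = tfp.vi.modified_gan(logu)

    # At u = 1 (i.e. logu = 0) both variants equal log(2), and the
    # self-normalized variant additionally satisfies f'(u=1) = 0.
    print(f_sn.numpy(), f.numpy())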