When self_normalized = True, the Amari-alpha Csiszar-function is:
f(u) = { -log(u) + (u - 1),                                        alpha = 0
       { u log(u) - (u - 1),                                       alpha = 1
       { [(u**alpha - 1) - alpha (u - 1)] / (alpha (alpha - 1)),   otherwise
When self_normalized = False, the (u - 1) terms are omitted.

Warning: when alpha != 0 and/or self_normalized = True, this function makes non-log-space calculations and may therefore be numerically unstable for |logu| >> 0.
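As a concrete illustration of the piecewise definition (and of why the warning applies), here is a minimal NumPy sketch. The helper name amari_alpha_reference is ours, not part of the library, and it evaluates u = exp(logu) directly rather than working in log-space, so it is only meant for moderate |logu|.

import numpy as np

def amari_alpha_reference(logu, alpha=1.0, self_normalized=False):
  # Evaluates the piecewise formula above at u = exp(logu) directly
  # (non-log-space), so it is numerically safe only for moderate |logu|.
  u = np.exp(np.asarray(logu, dtype=np.float64))
  if alpha == 0.0:
    f = -np.log(u)
  elif alpha == 1.0:
    f = u * np.log(u)
  else:
    f = (u**alpha - 1.0) / (alpha * (alpha - 1.0))
  if self_normalized:
    # The (u - 1) terms enforce f(1) = 0 and f'(1) = 0.
    if alpha == 0.0:
      f = f + (u - 1.0)
    elif alpha == 1.0:
      f = f - (u - 1.0)
    else:
      # Equivalent to [(u**alpha - 1) - alpha (u - 1)] / (alpha (alpha - 1)).
      f = f - (u - 1.0) / (alpha - 1.0)
  return f

For the values it can handle, this sketch should agree with the corresponding tfp.vi.amari_alpha call up to floating-point error.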
For more information, see:
A. Cichocki and S. Amari. "Families of Alpha- Beta- and Gamma-Divergences:
Flexible and Robust Measures of Similarities." Entropy, vol. 12, no. 6, pp.
1532-1568, 2010.
Args
  logu: float-like Tensor representing log(u) from above.
  alpha: float-like Python scalar. (See Mathematical Details for meaning.)
  self_normalized: Python bool indicating whether f'(u=1)=0. When f'(u=1)=0,
    the implied Csiszar f-Divergence remains non-negative even when p, q are
    unnormalized measures.
  name: Python str name prefixed to Ops created by this function.

Returns
  amari_alpha_of_u: float-like Tensor of the Csiszar-function evaluated at
    u = exp(logu).

Raises
  TypeError: if alpha is None or a Tensor.
  TypeError: if self_normalized is None or a Tensor.
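A short usage sketch follows; the input values and alpha = 0.5 are illustrative choices, not taken from the library docs. It exercises the arguments above and shows the effect of self_normalized: because the self-normalized form has f(1) = 0 and f'(1) = 0, every evaluated term is non-negative, whereas the plain form can produce negative values.

import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

# log(u) for a few illustrative values of u = p(x)/q(x).
logu = tf.constant(np.log([0.25, 1.0, 4.0]), dtype=tf.float64)

# Self-normalized form: f(1) = 0 and f'(1) = 0, so each value is >= 0
# even when p and q are unnormalized measures.
f_self_normalized = tfp.vi.amari_alpha(
    logu, alpha=0.5, self_normalized=True, name='amari_alpha_example')

# Without self-normalization the (u - 1) terms are dropped and individual
# values may be negative (e.g. at u = 4 for alpha = 0.5).
f_plain = tfp.vi.amari_alpha(logu, alpha=0.5, self_normalized=False)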