
tfp.experimental.bijectors.forward_log_det_jacobian_ratio

Computes p.fldj(x, ndims) - q.fldj(y, ndims), numerically stably.

Args

p: A bijector instance.
x: A tensor from the preimage of p.forward.
q: A bijector instance of the same type as p, with matching shape.
y: A tensor from the preimage of q.forward.
event_ndims: The number of right-hand dimensions comprising the event shapes of x and y.
use_kahan_sum: When True, the reduction of any remaining event_ndims beyond the minimum is done using Kahan summation. This requires statically known ranks.

Returns

fldj_ratio: log((abs o det o jac p)(x) / (abs o det o jac q)(y)), i.e., in TFP code, p.forward_log_det_jacobian(x, event_ndims) - q.forward_log_det_jacobian(y, event_ndims). In some cases this will be computed with better than naive numerical precision, e.g. by moving differences inside of a sum reduction.
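A minimal usage sketch is below. It assumes the positional argument order (p, x, q, y, event_ndims) shown above and uses Sigmoid bijectors purely for illustration; verify the exact signature against your installed TFP version.

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

# Two bijectors of the same type, evaluated at nearby points: the regime
# where naively subtracting the two log-det-Jacobians can lose precision.
p = tfb.Sigmoid()
q = tfb.Sigmoid()
x = tf.constant([[0.1, 0.2], [0.3, 0.4]])
y = x + 1e-6

# Stable ratio of forward log-det-Jacobians, reduced over the last dimension.
stable = tfp.experimental.bijectors.forward_log_det_jacobian_ratio(
    p, x, q, y, event_ndims=1)

# Naive reference computation; the two should agree up to numerical error.
naive = (p.forward_log_det_jacobian(x, event_ndims=1)
         - q.forward_log_det_jacobian(y, event_ndims=1))
```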