Negative Python int indicating the axis along which to compute the
cumulative sum. Note that positive (and zero) values are not supported.
validate_args
Python bool indicating whether arguments should be
checked for correctness.
name
Python str name given to ops managed by this object.
Raises
TypeError
if axis is not an int.
ValueError
if axis is not negative.
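For concreteness, a minimal usage sketch (assuming the standard tfp.bijectors import alias; the forward pass is a cumulative sum over the chosen axis):

import tensorflow as tf
import tensorflow_probability as tfp
tfb = tfp.bijectors

# Cumulative sum over the last axis (the default); only negative axes
# are supported. validate_args enables runtime argument checking.
cumsum = tfb.Cumsum(axis=-1, validate_args=True)
cumsum.forward(tf.constant([1., 2., 3.]))  # ==> [1., 3., 6.]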
Attributes
axis
Returns the axis over which this Bijector computes the cumsum.
dtype
forward_min_event_ndims
Returns the minimal number of dimensions bijector.forward operates on.
Multipart bijectors return structured ndims, which indicates the
expected structure of their inputs. Some multipart bijectors, notably
Composites, may return structures of None.
graph_parents
Returns this Bijector's graph_parents as a Python list.
inverse_min_event_ndims
Returns the minimal number of dimensions bijector.inverse operates on.
Multipart bijectors return structured event_ndims, which indicates the
expected structure of their outputs. Some multipart bijectors, notably
Composites, may return structures of None.
is_constant_jacobian
Returns true iff the Jacobian matrix is not a function of x.
Args
**override_parameters_kwargs
String/value dictionary of initialization
arguments to override with new values.
Returns
bijector
A new instance of type(self) initialized from the union
of self.parameters and override_parameters_kwargs, i.e.,
dict(self.parameters, **override_parameters_kwargs).
Returns the batch shape of this bijector for inputs of the given rank.
The batch shape of a bijector describes the set of distinct
transformations it represents on events of a given size. For example: the
bijector tfb.Scale([1., 2.]) has batch shape [2] for scalar events
(event_ndims = 0), because applying it to a scalar event produces
two scalar outputs, the result of two different scaling transformations.
The same bijector has batch shape [] for vector events, because applying
it to a vector produces (via elementwise multiplication) a single vector
output.
Bijectors that operate independently on multiple state parts, such as
tfb.JointMap, must broadcast to a coherent batch shape. Some events may
not be valid: for example, the bijector
tfb.JointMap([tfb.Scale([1., 2.]), tfb.Scale([1., 2., 3.])]) does not
produce a valid batch shape when event_ndims = [0, 0], since the batch
shapes of the two parts are inconsistent. The same bijector
does define valid batch shapes of [], [2], and [3] if event_ndims
is [1, 1], [0, 1], or [1, 0], respectively.
Since transforming a single event produces a scalar log-det-Jacobian, the
batch shape of a bijector with non-constant Jacobian is expected to equal
the shape of forward_log_det_jacobian(x, event_ndims=x_event_ndims)
or inverse_log_det_jacobian(y, event_ndims=y_event_ndims), for x
or y of the specified ndims.
Args
x_event_ndims
Optional Python int (structure) number of dimensions in
a probabilistic event passed to forward; this must be greater than
or equal to self.forward_min_event_ndims. If None, defaults to
self.forward_min_event_ndims. Mutually exclusive with y_event_ndims.
Default value: None.
y_event_ndims
Optional Python int (structure) number of dimensions in
a probabilistic event passed to inverse; this must be greater than
or equal to self.inverse_min_event_ndims. Mutually exclusive with
x_event_ndims.
Default value: None.
Returns
batch_shape
TensorShape batch shape of this bijector for a
value with the given event rank. May be unknown or partially defined.
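A short sketch of the scalar-vs-vector behaviour described above (continuing the tfb alias from the earlier sketch):

scale = tfb.Scale([1., 2.])
scale.experimental_batch_shape(x_event_ndims=0)  # ==> TensorShape([2])
scale.experimental_batch_shape(x_event_ndims=1)  # ==> TensorShape([])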
Returns the batch shape of this bijector for inputs of the given rank.
The batch shape of a bijector describes the set of distinct
transformations it represents on events of a given size. For example: the
bijector tfb.Scale([1., 2.]) has batch shape [2] for scalar events
(event_ndims = 0), because applying it to a scalar event produces
two scalar outputs, the result of two different scaling transformations.
The same bijector has batch shape [] for vector events, because applying
it to a vector produces (via elementwise multiplication) a single vector
output.
Bijectors that operate independently on multiple state parts, such as
tfb.JointMap, must broadcast to a coherent batch shape. Some events may
not be valid: for example, the bijector
tfb.JointMap([tfb.Scale([1., 2.]), tfb.Scale([1., 2., 3.])]) does not
produce a valid batch shape when event_ndims = [0, 0], since the batch
shapes of the two parts are inconsistent. The same bijector
does define valid batch shapes of [], [2], and [3] if event_ndims
is [1, 1], [0, 1], or [1, 0], respectively.
Since transforming a single event produces a scalar log-det-Jacobian, the
batch shape of a bijector with non-constant Jacobian is expected to equal
the shape of forward_log_det_jacobian(x, event_ndims=x_event_ndims)
or inverse_log_det_jacobian(y, event_ndims=y_event_ndims), for x
or y of the specified ndims.
Args
x_event_ndims
Optional Python int (structure) number of dimensions in
a probabilistic event passed to forward; this must be greater than
or equal to self.forward_min_event_ndims. If None, defaults to
self.forward_min_event_ndims. Mutually exclusive with y_event_ndims.
Default value: None.
y_event_ndims
Optional Python int (structure) number of dimensions in
a probabilistic event passed to inverse; this must be greater than
or equal to self.inverse_min_event_ndims. Mutually exclusive with
x_event_ndims.
Default value: None.
Returns
batch_shape_tensor
integer Tensor batch shape of this bijector for a
value with the given event rank.
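The tensor-valued variant behaves the same way but returns a concrete integer Tensor, which is useful when shapes are only known at execution time (a sketch, continuing the example above):

scale = tfb.Scale([1., 2.])
scale.experimental_batch_shape_tensor(x_event_ndims=0)  # ==> [2]
scale.experimental_batch_shape_tensor(x_event_ndims=1)  # ==> []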
Density correction for this transformation wrt the tangent space, at x.
Subclasses of Bijector may call the most specific applicable
method of TangentSpace, based on whether the transformation is
dimension-preserving, coordinate-wise, a projection, or something
more general. The backward-compatible assumption is that the
transformation is dimension-preserving (goes from R^n to R^n).
Args
x
Tensor (structure). The point at which to calculate the density.
tangent_space
TangentSpace or one of its subclasses. The tangent to
the support manifold at x.
backward_compat
bool specifying whether to assume that the Bijector
is dimension-preserving.
**kwargs
Optional keyword arguments forwarded to tangent space methods.
Returns
density_correction
Tensor representing the density correction, in log
space, under the transformation that this Bijector denotes.
Raises
TypeError
if backward_compat is False but no method of TangentSpace has been
called explicitly.
Returns the number of event dimensions produced by forward.
Args
event_ndims
Structure of Python and/or Tensor ints, and/or None
values. The structure should match that of
self.forward_min_event_ndims, and all non-None values must be
greater than or equal to the corresponding value in
self.forward_min_event_ndims.
**kwargs
Optional keyword arguments forwarded to nested bijectors.
Returns
forward_event_ndims
Structure of integers and/or None values matching
self.inverse_min_event_ndims. These are computed using 'prefer static'
semantics: if any inputs are None, some or all of the outputs may be
None, indicating that the output dimension could not be inferred
(conversely, if all inputs are non-None, all outputs will be
non-None). If all input event_ndims are Python ints, all of the
(non-None) outputs will be Python ints; otherwise, some or
all of the outputs may be Tensor ints.
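For a shape-preserving bijector such as Cumsum (whose forward_min_event_ndims is 1), the mapping is the identity on ranks; a quick sketch:

cumsum = tfb.Cumsum()
cumsum.forward_event_ndims(1)  # ==> 1
cumsum.forward_event_ndims(2)  # ==> 2 (the two rightmost dims form a single event)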
Returns the forward log det Jacobian evaluated at the given point.
Args
x
Tensor (structure). The input to the 'forward' Jacobian determinant
evaluation.
event_ndims
Optional number of dimensions in the probabilistic events
being transformed; this must be greater than or equal to
self.forward_min_event_ndims. If event_ndims is specified, the
log Jacobian determinant is summed to produce a
scalar log-determinant for each event. Otherwise
(if event_ndims is None), no reduction is performed.
Multipart bijectors require structured event_ndims, such that the
batch rank rank(x[i]) - event_ndims[i] is the same for all
elements i of the structured input. In most cases (with the
exception of tfb.JointMap) they further require that
event_ndims[i] - self.forward_min_event_ndims[i] is the same for
all elements i of the structured input.
Default value: None (equivalent to self.forward_min_event_ndims).
name
The name to give this op.
**kwargs
Named arguments forwarded to subclass implementation.
Returns
Tensor (structure), if this bijector is injective.
If not injective, this is not implemented.
Raises
TypeError
if x's dtype is incompatible with the expected input dtype.
NotImplementedError
if neither _forward_log_det_jacobian
nor {_inverse, _inverse_log_det_jacobian} are implemented, or
this is a non-injective bijector.
ValueError
if the value of event_ndims is not valid for this bijector.
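Because the cumulative sum is a linear map with a unit-triangular Jacobian, its log-determinant is zero for every input; a minimal sketch:

cumsum = tfb.Cumsum()
x = tf.constant([1., 2., 3.])
cumsum.forward_log_det_jacobian(x, event_ndims=1)  # ==> 0.0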
Returns the inverse Bijector evaluation, i.e., X = g^{-1}(Y).
Args
y
Tensor (structure). The input to the 'inverse' evaluation.
name
The name to give this op.
**kwargs
Named arguments forwarded to subclass implementation.
Returns
Tensor (structure), if this bijector is injective.
If not injective, returns the k-tuple containing the unique
k points (x1, ..., xk) such that g(xi) = y.
Raises
TypeError
if y's structured dtype is incompatible with the expected
output dtype.
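For Cumsum, the inverse is the first-difference operation (a sketch):

cumsum = tfb.Cumsum()
cumsum.inverse(tf.constant([1., 3., 6.]))  # ==> [1., 2., 3.]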
Returns the number of event dimensions produced by inverse.
Args
event_ndims
Structure of Python and/or Tensor ints, and/or None
values. The structure should match that of
self.inverse_min_event_ndims, and all non-None values must be
greater than or equal to the corresponding value in
self.inverse_min_event_ndims.
**kwargs
Optional keyword arguments forwarded to nested bijectors.
Returns
inverse_event_ndims
Structure of integers and/or None values matching
self.forward_min_event_ndims. These are computed using 'prefer static'
semantics: if any inputs are None, some or all of the outputs may be
None, indicating that the output dimension could not be inferred
(conversely, if all inputs are non-None, all outputs will be
non-None). If all input event_ndims are Python ints, all of the
(non-None) outputs will be Python ints; otherwise, some or
all of the outputs may be Tensor ints.
Note that forward_log_det_jacobian is the negative of this function,
evaluated at g^{-1}(y).
Args
y
Tensor (structure). The input to the 'inverse' Jacobian determinant
evaluation.
event_ndims
Optional number of dimensions in the probabilistic events
being transformed; this must be greater than or equal to
self.inverse_min_event_ndims. If event_ndims is specified, the
log Jacobian determinant is summed to produce a
scalar log-determinant for each event. Otherwise
(if event_ndims is None), no reduction is performed.
Multipart bijectors require structured event_ndims, such that the
batch rank rank(y[i]) - event_ndims[i] is the same for all
elements i of the structured input. In most cases (with the
exception of tfb.JointMap) they further require that
event_ndims[i] - self.inverse_min_event_ndims[i] is the same for
all elements i of the structured input.
Default value: None (equivalent to self.inverse_min_event_ndims).
name
The name to give this op.
**kwargs
Named arguments forwarded to subclass implementation.
Returns
ildj
Tensor, if this bijector is injective.
If not injective, returns the tuple of local log det
Jacobians, log(det(Dg_i^{-1}(y))), where g_i is the restriction
of g to the ith partition Di.
Raises
TypeError
if y's dtype is incompatible with the expected inverse-dtype.
NotImplementedError
if _inverse_log_det_jacobian is not implemented.
ValueError
if the value of event_ndims is not valid for this bijector.
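The relationship to forward_log_det_jacobian is easiest to see with a non-volume-preserving bijector; a sketch using tfb.Scale for illustration:

scale = tfb.Scale(2.)
scale.inverse_log_det_jacobian(4., event_ndims=0)  # ==> -log(2)
scale.forward_log_det_jacobian(2., event_ndims=0)  # ==> log(2), the negative of the
                                                   #     above evaluated at g^{-1}(4.) = 2.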
Returns a dict mapping constructor arg names to property annotations.
This dict should include an entry for each of the bijector's
Tensor-valued constructor arguments.
Args
dtype
Optional float dtype to assume for continuous-valued parameters.
Some constraining bijectors require advance knowledge of the dtype
because certain constants (e.g., tfb.Softplus.low) must be
instantiated with the same dtype as the values to be transformed.
Returns
parameter_properties
A str -> tfp.python.internal.parameter_properties.ParameterProperties
dict mapping constructor argument names to ParameterProperties
instances.
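A sketch of typical usage (the exact keys returned depend on the bijector class and are not asserted here):

props = tfb.Scale.parameter_properties(dtype=tf.float32)
for arg_name, prop in props.items():
  # Each ParameterProperties entry records, e.g., the event rank the
  # parameter is expected to have.
  print(arg_name, prop.event_ndims)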
with_name_scope
@classmethod
with_name_scope(
method
)
Decorator to automatically enter the module name scope.
class MyModule(tf.Module):
  @tf.Module.with_name_scope
  def __call__(self, x):
    if not hasattr(self, 'w'):
      self.w = tf.Variable(tf.random.normal([x.shape[1], 3]))
    return tf.matmul(x, self.w)
Using the above module would produce tf.Variables and tf.Tensors whose
names included the module name:
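A usage sketch (the variable and scope names shown in the comments are illustrative):

mod = MyModule()
mod(tf.ones([1, 2]))
# ==> a Tensor computed inside the 'my_module' name scope
mod.w
# ==> a tf.Variable named 'my_module/w:0'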
Applies or composes the Bijector, depending on input type.
This is a convenience function which applies the Bijector instance in
three different ways, depending on the input:
If the input is a tfd.Distribution instance, return
tfd.TransformedDistribution(distribution=input, bijector=self).
If the input is a tfb.Bijector instance, return
tfb.Chain([self, input]).
Otherwise, return self.forward(input).
Args
value
A tfd.Distribution, tfb.Bijector, or a (structure of) Tensor.
name
Python str name given to ops created by this function.
**kwargs
Additional keyword arguments passed into the created
tfd.TransformedDistribution, tfb.Bijector, or self.forward.
Returns
composition
A tfd.TransformedDistribution if the input was a
tfd.Distribution, a tfb.Chain if the input was a tfb.Bijector, or
a (structure of) Tensor computed by self.forward.
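The three cases above, sketched with Cumsum (assumes the tfd alias for tfp.distributions):

tfd = tfp.distributions

cumsum = tfb.Cumsum()
cumsum(tfd.Sample(tfd.Normal(0., 1.), 3))  # ==> a tfd.TransformedDistribution
cumsum(tfb.Exp())                          # ==> tfb.Chain([cumsum, tfb.Exp()])
cumsum(tf.constant([1., 2., 3.]))          # ==> cumsum.forward(...), i.e. [1., 3., 6.]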