tfp.bijectors.Chain

Bijector which applies a composition of bijectors.

Inherits From: Composition, AutoCompositeTensorBijector, Bijector, AutoCompositeTensor

Example Use:

  chain = Chain([Exp(), Softplus()], name="one_plus_exp")

Results in:

  • Forward:

    exp = Exp()
    softplus = Softplus()
    Chain([exp, softplus]).forward(x)
    = exp.forward(softplus.forward(x))
    = tf.exp(tf.math.log(1. + tf.exp(x)))
    = 1. + tf.exp(x)
    
  • Inverse:

    exp = Exp()
    softplus = Softplus()
    Chain([exp, softplus]).inverse(y)
    = softplus.inverse(exp.inverse(y))
    = tf.math.log(tf.exp(tf.math.log(y)) - 1.)
    = tf.math.log(y - 1.)
    

    Keyword arguments can be passed to the inner bijectors using their names, e.g.:

    chain = Chain([Bijector1(name='b1'), Bijector2(name='b2')])
    y = chain.forward(x, b1={'arg': 1}, b2={'arg': 2})
    
    # Equivalent to:
    z = Bijector2().forward(x, arg=2)
    y = Bijector1().forward(z, arg=1)
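
    A minimal end-to-end sketch of the composition above (assuming a standard
    tensorflow / tensorflow_probability installation):

    import tensorflow as tf
    import tensorflow_probability as tfp

    tfb = tfp.bijectors

    # Bijectors are applied right-to-left: Softplus first, then Exp.
    chain = tfb.Chain([tfb.Exp(), tfb.Softplus()], name="one_plus_exp")

    x = tf.constant(0.)
    y = chain.forward(x)       # exp(softplus(0.)) = 1. + exp(0.) = 2.
    x_back = chain.inverse(y)  # log(y - 1.) = log(1.) = 0.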
    

If every element of the bijectors list is a CompositeTensor, the resulting Chain bijector is a CompositeTensor as well. If any element of bijectors is not a CompositeTensor, then a non-CompositeTensor _Chain instance is created instead. Bijector subclasses that inherit from Chain will also inherit from CompositeTensor.
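
One practical consequence (a hedged sketch, assuming a recent TF/TFP release in which the standard bijectors are CompositeTensors) is that such a Chain can be passed directly as a traced argument to a tf.function:

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

@tf.function
def apply_forward(bijector, x):
  # Valid because a Chain of CompositeTensor bijectors is itself a
  # CompositeTensor and can be traced as a structured argument.
  return bijector.forward(x)

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])
apply_forward(chain, tf.constant([0., 1.]))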

Args

bijectors Python list of bijector instances. An empty list makes this bijector equivalent to the Identity bijector. The bijectors are applied in sequence starting from the end of the list.
validate_args Python bool indicating whether arguments should be checked for correctness.
validate_event_size Checks that bijectors are not applied to inputs with incomplete support (that is, inputs where one or more elements are a deterministic transformation of the others). For example, the following LDJ would be incorrect: Chain([Scale(), SoftmaxCentered()]).forward_log_det_jacobian([1], [1]) The Jacobian contribution from Scale applies to a 2-dimensional input, but the output of SoftmaxCentered is a 1-dimensional input embedded in a 2-dimensional space. Setting validate_event_size=True (default) prints warnings in these cases. When validate_args is also True, the warning is promoted to an exception.
parameters Locals dict captured by subclass constructor, to be used for copy/slice re-instantiation operators.
name Python str, name given to ops managed by this object. Default: derived from the inner bijector names, e.g. Chain([Exp(), Softplus()]).name == "chain_of_exp_of_softplus".
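
A short construction sketch illustrating the arguments above (the argument values are illustrative only):

import tensorflow_probability as tfp

tfb = tfp.bijectors

# The default name is derived from the inner bijectors.
tfb.Chain([tfb.Exp(), tfb.Softplus()]).name  # "chain_of_exp_of_softplus"

# Enable runtime argument checking and event-size validation explicitly.
chain = tfb.Chain([tfb.Exp(), tfb.Softplus()],
                  validate_args=True,
                  validate_event_size=True,
                  name="one_plus_exp")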

Raises

ValueError if bijectors have different dtypes.

Attributes

bijectors

dtype

forward_min_event_ndims Returns the minimal number of dimensions bijector.forward operates on.

Multipart bijectors return structured ndims, which indicates the expected structure of their inputs. Some multipart bijectors, notably Composites, may return structures of None.

graph_parents Returns this Bijector's graph_parents as a Python list.
has_static_min_event_ndims Returns True if the bijector has statically-known min_event_ndims. (deprecated)

inverse_min_event_ndims Returns the minimal number of dimensions bijector.inverse operates on.

Multipart bijectors return structured event_ndims, which indicates the expected structure of their outputs. Some multipart bijectors, notably Composites, may return structures of None.

is_constant_jacobian Returns true iff the Jacobian matrix is not a function of x.

name Returns the string name of this Bijector.
name_scope Returns a tf.name_scope instance for this class.
non_trainable_variables Sequence of non-trainable variables owned by this module and its submodules.
parameters Dictionary of parameters used to instantiate this Bijector.
submodules Sequence of all sub-modules.

Submodules are modules which are properties of this module, or found as properties of modules which are properties of this module (and so on).

a = tf.Module()
b = tf.Module()
c = tf.Module()
a.b = b
b.c = c
list(a.submodules) == [b, c]
True
list(b.submodules) == [c]
True
list(c.submodules) == []
True

trainable_variables Sequence of trainable variables owned by this module and its submodules.

validate_args Returns True if Tensor arguments will be validated.
validate_event_size

variables Sequence of variables owned by this module and its submodules.
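
An illustrative sketch of a few of the attributes above, using the scalar chain from the earlier example:

import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])
chain.bijectors                # the inner bijectors, in the order given
chain.forward_min_event_ndims  # 0: both inner bijectors act elementwise
chain.is_constant_jacobian     # False: the Jacobian depends on x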

Methods

copy

Creates a copy of the bijector.

Args
**override_parameters_kwargs String/value dictionary of initialization arguments to override with new values.

Returns
bijector A new instance of type(self) initialized from the union of self.parameters and override_parameters_kwargs, i.e., dict(self.parameters, **override_parameters_kwargs).
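
For example (a minimal sketch):

import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])
# Re-instantiate the same chain with one constructor argument overridden.
checked = chain.copy(validate_args=True)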

experimental_batch_shape

Returns the batch shape of this bijector for inputs of the given rank.

The batch shape of a bijector describes the set of distinct transformations it represents on events of a given size. For example: the bijector tfb.Scale([1., 2.]) has batch shape [2] for scalar events (event_ndims = 0), because applying it to a scalar event produces two scalar outputs, the result of two different scaling transformations. The same bijector has batch shape [] for vector events, because applying it to a vector produces (via elementwise multiplication) a single vector output.

Bijectors that operate independently on multiple state parts, such as tfb.JointMap, must broadcast to a coherent batch shape. Some events may not be valid: for example, the bijector tfb.JointMap([tfb.Scale([1., 2.]), tfb.Scale([1., 2., 3.])]) does not produce a valid batch shape when event_ndims = [0, 0], since the batch shapes of the two parts are inconsistent. The same bijector does define valid batch shapes of [], [2], and [3] if event_ndims is [1, 1], [0, 1], or [1, 0], respectively.

Since transforming a single event produces a scalar log-det-Jacobian, the batch shape of a bijector with non-constant Jacobian is expected to equal the shape of forward_log_det_jacobian(x, event_ndims=x_event_ndims) or inverse_log_det_jacobian(y, event_ndims=y_event_ndims), for x or y of the specified ndims.

Args
x_event_ndims Optional Python int (structure) number of dimensions in a probabilistic event passed to forward; this must be greater than or equal to self.forward_min_event_ndims. If None, defaults to self.forward_min_event_ndims. Mutually exclusive with y_event_ndims. Default value: None.
y_event_ndims Optional Python int (structure) number of dimensions in a probabilistic event passed to inverse; this must be greater than or equal to self.inverse_min_event_ndims. Mutually exclusive with x_event_ndims. Default value: None.

Returns
batch_shape TensorShape batch shape of this bijector for a value with the given event rank. May be unknown or partially defined.
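
A sketch of the Scale example described above (the Chain line additionally assumes the composition broadcasts its parts' batch shapes):

import tensorflow_probability as tfp

tfb = tfp.bijectors

scale = tfb.Scale([1., 2.])
scale.experimental_batch_shape(x_event_ndims=0)  # TensorShape([2])
scale.experimental_batch_shape(x_event_ndims=1)  # TensorShape([])

chain = tfb.Chain([tfb.Shift(1.), tfb.Scale([1., 2.])])
chain.experimental_batch_shape(x_event_ndims=0)  # expected TensorShape([2])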

experimental_batch_shape_tensor

Returns the batch shape of this bijector for inputs of the given rank.

The batch shape of a bijector describes the set of distinct transformations it represents on events of a given size. For example: the bijector tfb.Scale([1., 2.]) has batch shape [2] for scalar events (event_ndims = 0), because applying it to a scalar event produces two scalar outputs, the result of two different scaling transformations. The same bijector has batch shape [] for vector events, because applying it to a vector produces (via elementwise multiplication) a single vector output.

Bijectors that operate independently on multiple state parts, such as tfb.JointMap, must broadcast to a coherent batch shape. Some events may not be valid: for example, the bijector tfb.JointMap([tfb.Scale([1., 2.]), tfb.Scale([1., 2., 3.])]) does not produce a valid batch shape when event_ndims = [0, 0], since the batch shapes of the two parts are inconsistent. The same bijector does define valid batch shapes of [], [2], and [3] if event_ndims is [1, 1], [0, 1], or [1, 0], respectively.

Since transforming a single event produces a scalar log-det-Jacobian, the batch shape of a bijector with non-constant Jacobian is expected to equal the shape of forward_log_det_jacobian(x, event_ndims=x_event_ndims) or inverse_log_det_jacobian(y, event_ndims=y_event_ndims), for x or y of the specified ndims.

Args
x_event_ndims Optional Python int (structure) number of dimensions in a probabilistic event passed to forward; this must be greater than or equal to self.forward_min_event_ndims. If None, defaults to self.forward_min_event_ndims. Mutually exclusive with y_event_ndims. Default value: None.
y_event_ndims Optional Python int (structure) number of dimensions in a probabilistic event passed to inverse; this must be greater than or equal to self.inverse_min_event_ndims. Mutually exclusive with x_event_ndims. Default value: None.

Returns
batch_shape_tensor integer Tensor batch shape of this bijector for a value with the given event rank.
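
The tensor-valued counterpart of the example above (a minimal sketch):

import tensorflow_probability as tfp

tfb = tfp.bijectors

scale = tfb.Scale([1., 2.])
scale.experimental_batch_shape_tensor(x_event_ndims=0)  # [2]
scale.experimental_batch_shape_tensor(x_event_ndims=1)  # []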