tfp.substrates.numpy.bijectors.AutoCompositeTensorBijector

<!-- Stable -->
<table class="tfo-notebook-buttons tfo-api nocontent" align="left">
<td>
  <a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L103-L1609">
    <img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png" />
    View source on GitHub
  </a>
</td>
</table>

Interface for transformations of a `Distribution` sample.

<section class="expandable">
  <h4 class="showalways">View aliases</h4>
  <p><b>Main aliases</b></p>
  <p>
    <a href="https://www.tensorflow.org/probability/api_docs/python/tfp/substrates/numpy/bijectors/AutoCompositeTensorBijector"><code>tfp.experimental.substrates.numpy.bijectors.AutoCompositeTensorBijector</code></a>,
    <a href="https://www.tensorflow.org/probability/api_docs/python/tfp/substrates/numpy/bijectors/AutoCompositeTensorBijector"><code>tfp.experimental.substrates.numpy.bijectors.Bijector</code></a>,
    <a href="https://www.tensorflow.org/probability/api_docs/python/tfp/substrates/numpy/bijectors/AutoCompositeTensorBijector"><code>tfp.substrates.numpy.bijectors.Bijector</code></a>
  </p>
</section>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>@abc.abstractmethod</code>
<code>tfp.substrates.numpy.bijectors.AutoCompositeTensorBijector(
    graph_parents=None,
    is_constant_jacobian=False,
    validate_args=False,
    dtype=None,
    forward_min_event_ndims=UNSPECIFIED,
    inverse_min_event_ndims=UNSPECIFIED,
    parameters=None,
    name=None
)
</code></pre>

<!-- Placeholder for "Used in" -->

Bijectors can be used to represent any differentiable and injective
(one-to-one) function defined on an open subset of `R^n`. Some non-injective
transformations are also supported (see 'Non Injective Transforms' below).

#### Mathematical Details

A `Bijector` implements a [smooth covering map](
https://en.wikipedia.org/wiki/Local_diffeomorphism), i.e., a local
diffeomorphism such that every point in the target has a neighborhood evenly
covered by a map ([see also](
https://en.wikipedia.org/wiki/Covering_space#Covering_of_a_manifold)).
A `Bijector` is used by `TransformedDistribution`, but can be used more
generally to transform any `Distribution`-generated `Tensor`.

A `Bijector` is characterized by three operations:

1. Forward

   Useful for turning one random outcome into another random outcome from a
   different distribution.

2. Inverse

   Useful for 'reversing' a transformation to compute one probability in
   terms of another.

3. `log_det_jacobian(x)`

   'The log of the absolute value of the determinant of the matrix of all
   first-order partial derivatives of the inverse function.'

   Useful for inverting a transformation to compute one probability in terms
   of another. Geometrically, the Jacobian determinant is the volume of the
   transformation and is used to scale the probability.

   We take the absolute value of the determinant before log to avoid NaN
   values. Geometrically, a negative determinant corresponds to an
   orientation-reversing transformation. It is ok for us to discard the sign
   of the determinant because we only integrate everywhere-nonnegative
   functions (probability densities) and the correct orientation is always
   the one that produces a nonnegative integrand.

By convention, transformations of random variables are named in terms of the
forward transformation. The forward transformation creates samples; the
inverse is useful for computing probabilities.

#### Example Uses

- Basic properties:

  ```python
  x = ...  # A tensor.
  # Evaluate forward transformation.
  fwd_x = my_bijector.forward(x)
  x == my_bijector.inverse(fwd_x)
  x != my_bijector.forward(fwd_x)  # Not equal because x != g(g(x)).
  ```

- Computing a log-likelihood:

  ```python
  def transformed_log_prob(bijector, log_prob, x):
    return (bijector.inverse_log_det_jacobian(x, event_ndims=0) +
            log_prob(bijector.inverse(x)))
  ```

- Transforming a random outcome:

  ```python
  def transformed_sample(bijector, x):
    return bijector.forward(x)
  ```

#### Example Bijectors

- 'Exponential'

  ```none
  Y = g(X) = exp(X)
  X ~ Normal(0, 1)  # Univariate.

  Implies:
    g^{-1}(Y) = log(Y)
    |Jacobian(g^{-1})(y)| = 1 / y
    Y ~ LogNormal(0, 1), i.e.,
    prob(Y=y) = |Jacobian(g^{-1})(y)| * prob(X=g^{-1}(y))
              = (1 / y) Normal(log(y); 0, 1)
  ```

Here is an example of how one might implement the Exp bijector:

```python
class Exp(Bijector):

  def __init__(self, validate_args=False, name='exp'):
    super(Exp, self).__init__(
        validate_args=validate_args,
        forward_min_event_ndims=0,
        name=name)

  def _forward(self, x):
    return tf.exp(x)

  def _inverse(self, y):
    return tf.math.log(y)

  def _inverse_log_det_jacobian(self, y):
    return -self._forward_log_det_jacobian(self._inverse(y))

  def _forward_log_det_jacobian(self, x):
    # Notice that we needn't do any reducing, even when `event_ndims > 0`.
    # The base Bijector class will handle reducing for us; it knows how
    # to do so because we called `super` `__init__` with
    # `forward_min_event_ndims=0`.
    return x
```

- 'Affine'

  ```none
  Y = g(X) = sqrtSigma * X + mu
  X ~ MultivariateNormal(0, I_d)

  Implies:
    g^{-1}(Y) = inv(sqrtSigma) * (Y - mu)
    |Jacobian(g^{-1})(y)| = det(inv(sqrtSigma))
    Y ~ MultivariateNormal(mu, sqrtSigma), i.e.,
    prob(Y=y) = |Jacobian(g^{-1})(y)| * prob(X=g^{-1}(y))
              = det(sqrtSigma)^(-1) *
                MultivariateNormal(inv(sqrtSigma) * (y - mu); 0, I_d)
  ```
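
Below is a quick numerical check of the 'Exponential' change-of-variables
formula above. This is a sketch, not part of the class itself; it assumes a
TFP installation where the numpy substrate is importable, and builds the
`LogNormal` from the `Exp` bijector via `TransformedDistribution`:

```python
import numpy as np
import tensorflow_probability.substrates.numpy as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

y = 2.5
# Y = exp(X), X ~ Normal(0, 1)  ==>  Y ~ LogNormal(0, 1).
log_normal = tfd.TransformedDistribution(tfd.Normal(0., 1.), tfb.Exp())

# prob(Y=y) = |Jacobian(g^{-1})(y)| * prob(X=g^{-1}(y))
#           = (1 / y) * Normal(log(y); 0, 1).
manual = (1. / y) * tfd.Normal(0., 1.).prob(np.log(y))
np.testing.assert_allclose(log_normal.prob(y), manual, rtol=1e-6)
```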

#### Min_event_ndims and Naming

Bijectors are named for the dimensionality of data they act on (i.e. without
broadcasting). We can think of bijectors as having an intrinsic
`min_event_ndims`, which is the minimum number of dimensions on which the
bijector acts. For instance, a Cholesky decomposition requires a matrix, and
hence `min_event_ndims=2`.

#### Some examples:

```
AffineScalar:  min_event_ndims=0
Affine:  min_event_ndims=1
Cholesky:  min_event_ndims=2
Exp:  min_event_ndims=0
Sigmoid:  min_event_ndims=0
SoftmaxCentered:  min_event_ndims=1
```

Note the difference between `Affine` and `AffineScalar`. `AffineScalar`
operates on scalar events, whereas `Affine` operates on vector-valued events.

More generally, there is a `forward_min_event_ndims` and an
`inverse_min_event_ndims`. In most cases, these will be the same.
However, for some shape-changing bijectors, these will be different
(e.g., a bijector that pads an extra dimension at the end might have
`forward_min_event_ndims=0` and `inverse_min_event_ndims=1`).
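
These properties can be inspected directly. A sketch (assuming the numpy
substrate and that these bijectors exist in your TFP version):

```python
import tensorflow_probability.substrates.numpy as tfp

tfb = tfp.bijectors

tfb.Exp().forward_min_event_ndims                   # ==> 0 (scalar events)
tfb.SoftmaxCentered().forward_min_event_ndims       # ==> 1 (vector events)
tfb.CholeskyOuterProduct().forward_min_event_ndims  # ==> 2 (matrix events)
```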

##### Additional Considerations for "Multi Tensor" Bijectors

Bijectors which operate on structures of `Tensor` require structured
`min_event_ndims` matching the structure of the inputs. In these cases,
`min_event_ndims` describes both the minimum dimensionality *and* the
structure of arguments to `forward` and `inverse`. For example:

```
Split([sizes], axis):
  forward_min_event_ndims=-axis
  inverse_min_event_ndims=[-axis] * len(sizes)
```

Note: By default, we require `shape(x[i])[-event_ndims:-min_event_ndims]` to
be identical for all elements `i` of the structured input `x`. Specifically,
broadcasting over non-minimal event-dims is generally not allowed for
structured inputs, with the exception described in the next paragraph.

**Independent parts**: multipart transformations in which the parts do not
interact with each other, such as `tfb.JointMap`, `tfb.Restructure`, and
chains of these, may allow `event_ndims[i] - min_event_ndims[i]` to take
different values across different parts. The parts must still share a common
(broadcast) batch shape, i.e., the shape of the log Jacobian determinant,
but independence removes the requirement for further alignment in the event
shapes. For example, a `JointMap` bijector may be used to transform
distributions of varying event rank and size, even when other multipart
bijectors such as `tfb.Invert(tfb.Split(n))` would require all inputs to have
the same event rank:

```python
jm = tfb.JointMap([tfb.Scale([1., 2.]),
                   tfb.Scale([3., 4., 5.])])

fldj = jm.forward_log_det_jacobian([tf.ones([2]), tf.ones([3])],
                                    event_ndims=[1, 1])
# ==> `fldj` has shape `[]`.

fldj = jm.forward_log_det_jacobian([tf.ones([2]), tf.ones([3])],
                                    event_ndims=[1, 0])
# ==> `fldj` has shape `[3]` (the shape-`[2]` input part is implicitly
#      broadcast to shape `[3, 2]`, creating a common batch shape).

fldj = jm.forward_log_det_jacobian([tf.ones([2]), tf.ones([3])],
                                    event_ndims=[0, 0])
# ==> Error; `[2]` and `[3]` do not broadcast to a consistent batch shape.

```

#### Jacobian Determinant

The Jacobian determinant of a single-part bijector is a reduction over
`event_ndims - min_event_ndims` (`forward_min_event_ndims` for
`forward_log_det_jacobian` and `inverse_min_event_ndims` for
`inverse_log_det_jacobian`).

To see this, consider the `Exp` `Bijector` applied to a `Tensor` which has
sample, batch, and event (S, B, E) shape semantics. Suppose the `Tensor`'s
partitioned-shape is `(S=[4], B=[2], E=[3, 3])`. The shape of the `Tensor`
returned by `forward` and `inverse` is unchanged, i.e., `[4, 2, 3, 3]`.
However the shape returned by `inverse_log_det_jacobian` is `[4, 2]` because
the Jacobian determinant is a reduction over the event dimensions.

Another example is the `Affine` `Bijector`. Because `min_event_ndims = 1`, the
Jacobian determinant reduction is over `event_ndims - 1`.
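
The shape contract can be observed directly; a minimal sketch, assuming the
numpy substrate:

```python
import numpy as np
import tensorflow_probability.substrates.numpy as tfp

tfb = tfp.bijectors

y = np.ones([4, 2, 3, 3])  # Partitioned as (S=[4], B=[2], E=[3, 3]).
exp = tfb.Exp()

# `forward` and `inverse` preserve the input shape.
exp.inverse(y).shape                                  # ==> (4, 2, 3, 3)
# The log-det Jacobian reduces over the E=[3, 3] event dimensions.
exp.inverse_log_det_jacobian(y, event_ndims=2).shape  # ==> (4, 2)
```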

It is sometimes useful to implement the inverse Jacobian determinant as the
negative forward Jacobian determinant. For example,

```python
def _inverse_log_det_jacobian(self, y):
  return -self._forward_log_det_jacobian(self._inverse(y))  # Note negation.
```

The correctness of this approach can be seen from the following claim.

- Claim:

    Assume `Y = g(X)` is a bijection whose derivative exists and is nonzero
    for its domain, i.e., `dY/dX = d/dX g(X) != 0`. Then:

    ```none
    (log o det o jacobian o g^{-1})(Y) = -(log o det o jacobian o g)(X)
    ```

- Proof:

    From the bijective, nonzero differentiability of `g`, the
    [inverse function theorem](
        https://en.wikipedia.org/wiki/Inverse_function_theorem)
    implies `g^{-1}` is differentiable in the image of `g`.
    Applying the chain rule to `y = g(x) = g(g^{-1}(y))` yields
    `I = g'(g^{-1}(y))*g^{-1}'(y)`.
    The same theorem also implies `g^{-1}'` is non-singular; therefore,
    `inv[ g'(g^{-1}(y)) ] = g^{-1}'(y)`.
    The claim follows from [properties of determinant](
https://en.wikipedia.org/wiki/Determinant#Multiplicativity_and_matrix_groups).

Generally it's preferable to directly implement the inverse Jacobian
determinant.  This should have superior numerical stability and will often
share subgraphs with the `_inverse` implementation.
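
For example, a direct `_inverse_log_det_jacobian` for the `Exp` bijector
defined earlier might look like the following sketch (an illustration, not
the library's implementation): since `d/dy log(y) = 1/y`, the log-det
Jacobian of the inverse is `-log(y)`, which reuses the same `log`
computation as `_inverse`.

```python
import tensorflow as tf
from tensorflow_probability import bijectors as tfb


class Exp(tfb.Bijector):

  def __init__(self, validate_args=False, name='exp'):
    super(Exp, self).__init__(
        validate_args=validate_args,
        forward_min_event_ndims=0,
        name=name)

  def _forward(self, x):
    return tf.exp(x)

  def _inverse(self, y):
    return tf.math.log(y)

  def _inverse_log_det_jacobian(self, y):
    # d/dy log(y) = 1/y, so log|det J| = -log(y). This shares the `log`
    # computation with `_inverse` instead of negating
    # `_forward_log_det_jacobian(self._inverse(y))`.
    return -self._inverse(y)
```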

Note that Jacobian determinants are always a single Tensor (potentially with
batch dimensions), even for bijectors that act on multipart structures, since
any multipart transformation may be viewed as a transformation on a single
(possibly batched) vector obtained by flattening and
concatenating the input parts.
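
A sketch of this behavior (assuming the numpy substrate): the `JointMap`
below transforms two parts, yet its log-det Jacobian is a single scalar, the
sum of the per-part contributions.

```python
import numpy as np
import tensorflow_probability.substrates.numpy as tfp

tfb = tfp.bijectors

jm = tfb.JointMap([tfb.Exp(), tfb.Scale(2.)])
x = [np.ones([3]), np.ones([3])]
fldj = jm.forward_log_det_jacobian(x, event_ndims=[1, 1])
# ==> one scalar: sum(x[0]) + 3 * log(2.) = 3. + 3 * log(2.)
```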

#### Is_constant_jacobian

Certain bijectors will have constant Jacobian matrices. For instance, the
`Affine` bijector encodes multiplication by a matrix plus a shift, and its
Jacobian matrix is that same matrix.

`is_constant_jacobian` encodes the fact that the Jacobian matrix is constant.
The semantics of this argument are the following:

  * Repeated calls to 'log_det_jacobian' functions with the same
    `event_ndims` (but not necessarily the same input) will return the first
    computed Jacobian (because the matrix is constant, and hence is input
    independent).
  * `log_det_jacobian` implementations are merely broadcastable to the true
    `log_det_jacobian` (because, again, the Jacobian matrix is input
    independent). Specifically, `log_det_jacobian` is implemented as the
    log Jacobian determinant for a single input.

    ```python
    class Identity(Bijector):

      def __init__(self, validate_args=False, name='identity'):
        super(Identity, self).__init__(
            is_constant_jacobian=True,
            validate_args=validate_args,
            forward_min_event_ndims=0,
            name=name)

      def _forward(self, x):
        return x

      def _inverse(self, y):
        return y

      def _inverse_log_det_jacobian(self, y):
        return -self._forward_log_det_jacobian(self._inverse(y))

      def _forward_log_det_jacobian(self, x):
        # The full log Jacobian determinant would be tf.zeros_like(x).
        # However, we circumvent materializing that, since the Jacobian
        # calculation is input independent, and we specify it for one input.
        return tf.constant(0., x.dtype)

    ```
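
The same contract is observable on a stock constant-Jacobian bijector; a
sketch assuming the numpy substrate:

```python
import numpy as np
import tensorflow_probability.substrates.numpy as tfp

tfb = tfp.bijectors

b = tfb.Scale(2.)
b.is_constant_jacobian  # ==> True

# With `event_ndims=0` the result is log(2.): input independent, and
# merely broadcastable to the `[3]`-shaped input.
b.forward_log_det_jacobian(np.ones([3]), event_ndims=0)
```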

#### Subclass Requirements

- Subclasses typically implement:

    - `_forward`,
    - `_inverse`,
    - `_inverse_log_det_jacobian`,
    - `_forward_log_det_jacobian` (optional),
    - `_is_increasing` (scalar bijectors only)

  The `_forward_log_det_jacobian` is called when the bijector is inverted via
  the `Invert` bijector. If undefined, a slightly less efficient
  calculation, `-1 * _inverse_log_det_jacobian`, is used.

  If the bijector changes the shape of the input, you must also implement:

    - `_forward_event_shape_tensor`,
    - `_forward_event_shape` (optional),
    - `_inverse_event_shape_tensor`,
    - `_inverse_event_shape` (optional).

  By default the event-shape is assumed unchanged from input.
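
  For instance, `SoftmaxCentered` (a shape-changing bijector that maps
  length-`n` vectors to length-`n+1` probability vectors) reports its
  event-shape change through these hooks. A sketch, assuming the numpy
  substrate:

  ```python
  import tensorflow_probability.substrates.numpy as tfp

  tfb = tfp.bijectors

  b = tfb.SoftmaxCentered()
  b.forward_event_shape([3])  # ==> [4]; `forward` appends one element.
  b.inverse_event_shape([4])  # ==> [3]
  ```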

  Multipart bijectors, which operate on structures of tensors, may implement
  additional methods to propagate call-time dtype information over any
  changes to structure. These methods are:

    - `_forward_dtype`
    - `_inverse_dtype`
    - `_forward_event_ndims`
    - `_inverse_event_ndims`

- If the `Bijector`'s use is limited to `TransformedDistribution` (or friends
  like `QuantizedDistribution`) then, depending on your use, you may not need
  to implement all of the `_forward` and `_inverse` functions.

  Examples:

    1. Sampling (e.g., `sample`) only requires `_forward`.
    2. Probability functions (e.g., `prob`, `cdf`, `survival_function`) only
       require `_inverse` (and related).
    3. Only calling probability functions on the output of `sample` means
       `_inverse` can be implemented as a cache lookup.

  See 'Example Uses' [above] which shows how these functions are used to
  transform a distribution. (Note: `_forward` could theoretically be
  implemented as a cache lookup but this would require controlling the
  underlying sample generation mechanism.)
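
As a sketch of this division of labor (assuming the numpy substrate):
`sample` below exercises only `_forward`, while `log_prob` exercises
`_inverse` and `_inverse_log_det_jacobian` (and, on fresh samples, `_inverse`
may be served from the bijector's cache).

```python
import tensorflow_probability.substrates.numpy as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

dist = tfd.TransformedDistribution(tfd.Normal(0., 1.), tfb.Exp())
y = dist.sample(seed=42)  # Calls the bijector's `_forward`.
dist.log_prob(y)          # Calls `_inverse` + `_inverse_log_det_jacobian`.
```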

#### Non Injective Transforms

Warning: Handling of non-injective transforms is subject to change.

Non injective maps `g` are supported, provided their domain `D` can be
partitioned into `k` disjoint subsets, `Union{D1, ..., Dk}`, such that,
ignoring sets of measure zero, the restriction of `g` to each subset is a
differentiable bijection onto `g(D)`.  In particular, this implies that for
`y in g(D)`, the set inverse, i.e. `g^{-1}(y) = {x in D : g(x) = y}`, always
contains exactly `k` distinct points.

The property `_is_injective` is set to `False` to indicate that the bijector
is not injective, yet satisfies the above condition.

The usual bijector API is modified in the case `_is_injective is False` (see
method docstrings for specifics).  Here we show by example the `AbsoluteValue`
bijector.  In this case, the domain `D = (-inf, inf)`, can be partitioned
into `D1 = (-inf, 0)`, `D2 = {0}`, and `D3 = (0, inf)`.  Let `gi` be the
restriction of `g` to `Di`, then both `g1` and `g3` are bijections onto
`(0, inf)`, with `g1^{-1}(y) = -y`, and `g3^{-1}(y) = y`.  We will use
`g1` and `g3` to define bijector methods over `D1` and `D3`.  `D2 = {0}` is
an oddball in that `g2` is one to one, and the derivative is not well defined.
Fortunately, when considering transformations of probability densities
(e.g. in `TransformedDistribution`), sets of measure zero have no effect in
theory, and only a small effect in 32 or 64 bit precision.  For that reason,
we define `inverse(0)` and `inverse_log_det_jacobian(0)` both as `[0, 0]`,
which is convenient and results in a left-semicontinuous pdf.


```python
abs = tfp.bijectors.AbsoluteValue()

abs.forward(-1.)
==> 1.

abs.forward(1.)
==> 1.

abs.inverse(1.)
==> (-1., 1.)

# The |dX/dY| is constant, == 1.  So Log|dX/dY| == 0.
abs.inverse_log_det_jacobian(1., event_ndims=0)
==> (0., 0.)

# Special case handling of 0.
abs.inverse(0.)
==> (0., 0.)

abs.inverse_log_det_jacobian(0., event_ndims=0)
==> (0., 0.)
```

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2"><h2 class="add-link">Args</h2></th></tr>

<tr>
<td>
`graph_parents`
</td>
<td>
Python list of graph prerequisites of this `Bijector`.
</td>
</tr><tr>
<td>
`is_constant_jacobian`
</td>
<td>
Python `bool` indicating that the Jacobian matrix is
not a function of the input.
</td>
</tr><tr>
<td>
`validate_args`
</td>
<td>
Python `bool`, default `False`. Whether to validate input
with asserts. If `validate_args` is `False`, and the inputs are invalid,
correct behavior is not guaranteed.
</td>
</tr><tr>
<td>
`dtype`
</td>
<td>
`tf.dtype` supported by this `Bijector`. `None` means dtype is not
enforced. For multipart bijectors, this value is expected to be the
same for all elements of the input and output structures.
</td>
</tr><tr>
<td>
`forward_min_event_ndims`
</td>
<td>
Python `integer` (structure) indicating the
minimum number of dimensions on which `forward` operates.
</td>
</tr><tr>
<td>
`inverse_min_event_ndims`
</td>
<td>
Python `integer` (structure) indicating the
minimum number of dimensions on which `inverse` operates. Will be set to
`forward_min_event_ndims` by default, if no value is provided.
</td>
</tr><tr>
<td>
`parameters`
</td>
<td>
Python `dict` of parameters used to instantiate this
`Bijector`. Bijector instances with identical types, names, and
`parameters` share an input/output cache. `parameters` dicts are
keyed by strings and are identical if their keys are identical and if
corresponding values have identical hashes (or object ids, for
unhashable objects).
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give Ops created by the initializer.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2"><h2 class="add-link">Raises</h2></th></tr>

<tr>
<td>
`ValueError`
</td>
<td>
If neither `forward_min_event_ndims` nor
`inverse_min_event_ndims` is specified, or if either of them is
negative.
</td>
</tr><tr>
<td>
`ValueError`
</td>
<td>
If a member of `graph_parents` is not a `Tensor`.
</td>
</tr>
</table>





<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2"><h2 class="add-link">Attributes</h2></th></tr>

<tr>
<td>
`dtype`
</td>
<td>

</td>
</tr><tr>
<td>
`forward_min_event_ndims`
</td>
<td>
Returns the minimal number of dimensions bijector.forward operates on.

Multipart bijectors return structured `ndims`, which indicates the
expected structure of their inputs. Some multipart bijectors, notably
Composites, may return structures of `None`.
</td>
</tr><tr>
<td>
`graph_parents`
</td>
<td>
Returns this `Bijector`'s graph_parents as a Python list.
</td>
</tr><tr>
<td>
`has_static_min_event_ndims`
</td>
<td>
Returns True if the bijector has statically-known `min_event_ndims`.
</td>
</tr><tr>
<td>
`inverse_min_event_ndims`
</td>
<td>
Returns the minimal number of dimensions bijector.inverse operates on.

Multipart bijectors return structured `event_ndims`, which indicates the
expected structure of their outputs. Some multipart bijectors, notably
Composites, may return structures of `None`.
</td>
</tr><tr>
<td>
`is_constant_jacobian`
</td>
<td>
Returns true iff the Jacobian matrix is not a function of x.

Note: Jacobian matrix is either constant for both forward and inverse or
neither.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
Returns the string name of this `Bijector`.
</td>
</tr><tr>
<td>
`parameters`
</td>
<td>
Dictionary of parameters used to instantiate this `Bijector`.
</td>
</tr><tr>
<td>
`trainable_variables`
</td>
<td>

</td>
</tr><tr>
<td>
`validate_args`
</td>
<td>
Returns True if Tensor arguments will be validated.
</td>
</tr><tr>
<td>
`variables`
</td>
<td>

</td>
</tr>
</table>



## Methods

<h3 id="forward"><code>forward</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1139-L1155">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward(
    x, name=&#x27;forward&#x27;, **kwargs
)
</code></pre>

Returns the forward `Bijector` evaluation, i.e., Y = g(X).


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`x`
</td>
<td>
`Tensor` (structure). The input to the 'forward' evaluation.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="2">
`Tensor` (structure).
</td>
</tr>

</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `self.dtype` is specified and `x.dtype` is not
`self.dtype`.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if `_forward` is not implemented.
</td>
</tr>
</table>



<h3 id="forward_dtype"><code>forward_dtype</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1444-L1471">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_dtype(
    dtype=UNSPECIFIED, name=&#x27;forward_dtype&#x27;, **kwargs
)
</code></pre>

Returns the dtype returned by `forward` for the provided input.


<h3 id="forward_event_ndims"><code>forward_event_ndims</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1502-L1509">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_event_ndims(
    event_ndims, **kwargs
)
</code></pre>

Returns the number of event dimensions produced by `forward`.


<h3 id="forward_event_shape"><code>forward_event_shape</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L998-L1018">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_event_shape(
    input_shape
)
</code></pre>

Shape of a single sample from a single batch as a `TensorShape`.

Same meaning as `forward_event_shape_tensor`. May be only partially defined.

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`input_shape`
</td>
<td>
`TensorShape` (structure) indicating event-portion shape
passed into `forward` function.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`forward_event_shape_tensor`
</td>
<td>
`TensorShape` (structure) indicating
event-portion shape after applying `forward`. Possibly unknown.
</td>
</tr>
</table>



<h3 id="forward_event_shape_tensor"><code>forward_event_shape_tensor</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L961-L991">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_event_shape_tensor(
    input_shape, name=&#x27;forward_event_shape_tensor&#x27;
)
</code></pre>

Shape of a single sample from a single batch as an `int32` 1D `Tensor`.


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`input_shape`
</td>
<td>
`Tensor`, `int32` vector (structure) indicating event-portion
shape passed into `forward` function.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
name to give to the op
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`forward_event_shape_tensor`
</td>
<td>
`Tensor`, `int32` vector (structure)
indicating event-portion shape after applying `forward`.
</td>
</tr>
</table>



<h3 id="forward_log_det_jacobian"><code>forward_log_det_jacobian</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1397-L1434">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_log_det_jacobian(
    x, event_ndims=None, name=&#x27;forward_log_det_jacobian&#x27;, **kwargs
)
</code></pre>

Returns the (log o det o Jacobian o forward)(x).


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`x`
</td>
<td>
`Tensor` (structure). The input to the 'forward' Jacobian determinant
evaluation.
</td>
</tr><tr>
<td>
`event_ndims`
</td>
<td>
Optional number of dimensions in the probabilistic events
being transformed; this must be greater than or equal to
`self.forward_min_event_ndims`. If `event_ndims` is specified, the
log Jacobian determinant is summed to produce a
scalar log-determinant for each event. Otherwise
(if `event_ndims` is `None`), no reduction is performed.
Multipart bijectors require *structured* event_ndims, such that the
batch rank `rank(x[i]) - event_ndims[i]` is the same for all
elements `i` of the structured input. In most cases (with the
exception of `tfb.JointMap`) they further require that
`event_ndims[i] - self.forward_min_event_ndims[i]` is the same for
all elements `i` of the structured input.
Default value: `None` (equivalent to `self.forward_min_event_ndims`).
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="2">
`Tensor` (structure), if this bijector is injective.
If not injective this is not implemented.
</td>
</tr>

</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `x`'s dtype is incompatible with the expected input dtype.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if neither `_forward_log_det_jacobian`
nor {`_inverse`, `_inverse_log_det_jacobian`} are implemented, or
this is a non-injective bijector.
</td>
</tr><tr>
<td>
`ValueError`
</td>
<td>
if the value of `event_ndims` is not valid for this bijector.
</td>
</tr>
</table>



<h3 id="inverse"><code>inverse</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1200-L1218">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse(
    y, name=&#x27;inverse&#x27;, **kwargs
)
</code></pre>

Returns the inverse `Bijector` evaluation, i.e., X = g^{-1}(Y).


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`y`
</td>
<td>
`Tensor` (structure). The input to the 'inverse' evaluation.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="2">
`Tensor` (structure), if this bijector is injective.
If not injective, returns the k-tuple containing the unique
`k` points `(x1, ..., xk)` such that `g(xi) = y`.
</td>
</tr>

</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `y`'s structured dtype is incompatible with the expected
output dtype.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if `_inverse` is not implemented.
</td>
</tr>
</table>



<h3 id="inverse_dtype"><code>inverse_dtype</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1473-L1500">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_dtype(
    dtype=UNSPECIFIED, name=&#x27;inverse_dtype&#x27;, **kwargs
)
</code></pre>

Returns the dtype returned by `inverse` for the provided input.


<h3 id="inverse_event_ndims"><code>inverse_event_ndims</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1511-L1518">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_event_ndims(
    event_ndims, **kwargs
)
</code></pre>

Returns the number of event dimensions produced by `inverse`.


<h3 id="inverse_event_shape"><code>inverse_event_shape</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1060-L1080">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_event_shape(
    output_shape
)
</code></pre>

Shape of a single sample from a single batch as a `TensorShape`.

Same meaning as `inverse_event_shape_tensor`. May be only partially defined.

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`output_shape`
</td>
<td>
`TensorShape` (structure) indicating event-portion shape
passed into `inverse` function.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`inverse_event_shape_tensor`
</td>
<td>
`TensorShape` (structure) indicating
event-portion shape after applying `inverse`. Possibly unknown.
</td>
</tr>
</table>



<h3 id="inverse_event_shape_tensor"><code>inverse_event_shape_tensor</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1025-L1053">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_event_shape_tensor(
    output_shape, name=&#x27;inverse_event_shape_tensor&#x27;
)
</code></pre>

Shape of a single sample from a single batch as an `int32` 1D `Tensor`.


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`output_shape`
</td>
<td>
`Tensor`, `int32` vector (structure) indicating
event-portion shape passed into `inverse` function.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
name to give to the op
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`inverse_event_shape_tensor`
</td>
<td>
`Tensor`, `int32` vector (structure)
indicating event-portion shape after applying `inverse`.
</td>
</tr>
</table>



<h3 id="inverse_log_det_jacobian"><code>inverse_log_det_jacobian</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1288-L1330">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_log_det_jacobian(
    y, event_ndims=None, name=&#x27;inverse_log_det_jacobian&#x27;, **kwargs
)
</code></pre>

Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: `log(det(dX/dY))(Y)`. (Recall that: `X=g^{-1}(Y)`.)

Note that `forward_log_det_jacobian` is the negative of this function,
evaluated at `g^{-1}(y)`.

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`y`
</td>
<td>
`Tensor` (structure). The input to the 'inverse' Jacobian determinant
evaluation.
</td>
</tr><tr>
<td>
`event_ndims`
</td>
<td>
Optional number of dimensions in the probabilistic events
being transformed; this must be greater than or equal to
`self.inverse_min_event_ndims`. If `event_ndims` is specified, the
log Jacobian determinant is summed to produce a
scalar log-determinant for each event. Otherwise
(if `event_ndims` is `None`), no reduction is performed.
Multipart bijectors require *structured* event_ndims, such that the
batch rank `rank(y[i]) - event_ndims[i]` is the same for all
elements `i` of the structured input. In most cases (with the
exception of `tfb.JointMap`) they further require that
`event_ndims[i] - self.inverse_min_event_ndims[i]` is the same for
all elements `i` of the structured input.
Default value: `None` (equivalent to `self.inverse_min_event_ndims`).
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`ildj`
</td>
<td>
`Tensor`, if this bijector is injective.
If not injective, returns the tuple of local log det
Jacobians, `log(det(Dg_i^{-1}(y)))`, where `g_i` is the restriction
of `g` to the `ith` partition `Di`.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `y`'s dtype is incompatible with the expected inverse-dtype.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if `_inverse_log_det_jacobian` is not implemented.
</td>
</tr><tr>
<td>
`ValueError`
</td>
<td>
if the value of `event_ndims` is not valid for this bijector.
</td>
</tr>
</table>



<h3 id="parameter_properties"><code>parameter_properties</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L1087-L1106">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>@classmethod</code>
<code>parameter_properties(
    dtype=tf.float32
)
</code></pre>

Returns a dict mapping constructor arg names to property annotations.

This dict should include an entry for each of the bijector's
`Tensor`-valued constructor arguments.

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`dtype`
</td>
<td>
Optional float `dtype` to assume for continuous-valued parameters.
Some constraining bijectors require advance knowledge of the dtype
because certain constants (e.g., `tfb.Softplus.low`) must be
instantiated with the same dtype as the values to be transformed.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`parameter_properties`
</td>
<td>
A `str -> tfp.python.internal.parameter_properties.ParameterProperties`
dict mapping constructor argument names to `ParameterProperties`
instances.
</td>
</tr>
</table>



<h3 id="__call__"><code>__call__</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L869-L954">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>__call__(
    value, name=None, **kwargs
)
</code></pre>

Applies or composes the `Bijector`, depending on input type.

This is a convenience function which applies the `Bijector` instance in
three different ways, depending on the input:

1. If the input is a `tfd.Distribution` instance, return
   `tfd.TransformedDistribution(distribution=input, bijector=self)`.
2. If the input is a `tfb.Bijector` instance, return
   `tfb.Chain([self, input])`.
3. Otherwise, return `self.forward(input)`.

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`value`
</td>
<td>
A `tfd.Distribution`, `tfb.Bijector`, or a (structure of) `Tensor`.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
Python `str` name given to ops created by this function.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Additional keyword arguments passed into the created
`tfd.TransformedDistribution`, `tfb.Bijector`, or `self.forward`.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`composition`
</td>
<td>
A `tfd.TransformedDistribution` if the input was a
`tfd.Distribution`, a `tfb.Chain` if the input was a `tfb.Bijector`, or
a (structure of) `Tensor` computed by `self.forward`.
</td>
</tr>
</table>


#### Examples

```python
sigmoid = tfb.Reciprocal()(
    tfb.Shift(shift=1.)(
      tfb.Exp()(
        tfb.Scale(scale=-1.))))
# ==> `tfb.Chain([
#         tfb.Reciprocal(),
#         tfb.Shift(shift=1.),
#         tfb.Exp(),
#         tfb.Scale(scale=-1.),
#      ])`  # ie, `tfb.Sigmoid()`

log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`

tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])
```

<h3 id="__eq__"><code>__eq__</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.13.0/tensorflow_probability/substrates/numpy/bijectors/bijector.py#L828-L859">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>__eq__(
    other
)
</code></pre>

Return self==value.