# tfp.experimental.substrates.numpy.bijectors.Softfloor

Compute a differentiable approximation to tf.math.floor.

Inherits From: Bijector

Given x, compute a differentiable approximation to tf.math.floor(x). It is parameterized by a temperature t, which controls the closeness of the approximation at the cost of numerical stability of the inverse.

This Bijector has the following properties:

• This Bijector is a map from R to R.
• For t approaching 0, this bijector converges pointwise to tf.math.floor (except at integer points).
• For t approaching infinity, this bijector approaches the identity function shifted down by 0.5 (i.e., x - 0.5).

Note that for lower temperatures t, this bijector becomes more numerically unstable. In particular, the inverse of this bijector is not numerically stable at low temperatures, because flooring is not a bijective function (and hence any pointwise limit towards the floor function will start to have a numerically unstable inverse).

#### Mathematical details

Let x be in [0.5, 1.5]. We would like to simulate the floor function on this interval. We will do this via a shifted and rescaled sigmoid.

floor(x) = 0 for x < 1 and floor(x) = 1 for x >= 1. Take f(x) = sigmoid((x - 1.) / t) with t > 0. As t goes to zero, f(x) tends to 1 for x > 1 and tends to 0 for x < 1, giving a function that looks like the floor function. If we shift f(x) by -sigmoid(-0.5 / t) and rescale by 1 / (sigmoid(0.5 / t) - sigmoid(-0.5 / t)), we preserve the pointwise limit and additionally fix f(0.5) = 0 and f(1.5) = 1.

Thus we can define softfloor(x, t) = a * sigmoid((x - 1.) / t) + b

where

• a = 1 / (sigmoid(0.5 / t) - sigmoid(-0.5 / t))
• b = -sigmoid(-0.5 / t) / (sigmoid(0.5 / t) - sigmoid(-0.5 / t))

The implementation of the Softfloor bijector follows this construction, with the caveat that the function is extended to the whole real line by appropriately shifting it for each integer.
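As a sanity check, the construction above can be sketched in plain NumPy. The helper below is an illustrative reimplementation of the stated formulas, not the library's actual code (which handles dtypes and edge cases differently):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softfloor(x, t):
    # Shift x so the [0.5, 1.5] construction applies within each unit cell.
    k = np.floor(x - 0.5)   # integer offset of the cell containing x
    u = x - k               # u lies in [0.5, 1.5)
    a = 1.0 / (sigmoid(0.5 / t) - sigmoid(-0.5 / t))
    b = -sigmoid(-0.5 / t) * a
    # f(0.5) = 0 and f(1.5) = 1, so the pieces glue continuously across cells.
    return k + a * sigmoid((u - 1.0) / t) + b

x = np.array([2.1, 3.2, 5.5])
softfloor(x, 0.01)   # low temperature: close to floor -> ~[2., 3., 5.]
softfloor(x, 100.0)  # high temperature: close to x - 0.5 -> ~[1.6, 2.7, 5.0]
```

The endpoint conditions f(0.5) = 0 and f(1.5) = 1 are exactly what makes the per-cell copies join into a continuous function on all of R.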

#### Example use:

```python
# High temperature.
soft_floor = tfb.Softfloor(temperature=100.)
x = [2.1, 3.2, 5.5]
soft_floor.forward(x)

# Low temperature. This acts like a floor.
soft_floor = tfb.Softfloor(temperature=0.01)
soft_floor.forward(x)  # Should be close to [2., 3., 5.]

# Ceiling is just a shifted floor at non-integer points.
soft_ceiling = tfb.Chain(
    [tfb.AffineScalar(1.),
     tfb.Softfloor(temperature=1.)])
soft_ceiling.forward(x)  # Should be close to [3., 4., 6.]
```

#### Args

• `graph_parents`: Python list of graph prerequisites of this `Bijector`.
• `is_constant_jacobian`: Python `bool` indicating that the Jacobian matrix is not a function of the input.
• `validate_args`: Python `bool`, default `False`. Whether to validate input with asserts. If `validate_args` is `False` and the inputs are invalid, correct behavior is not guaranteed.
• `dtype`: `tf.dtype` supported by this `Bijector`. `None` means dtype is not enforced.
• `forward_min_event_ndims`: Python integer indicating the minimum number of dimensions `forward` operates on.
• `inverse_min_event_ndims`: Python integer indicating the minimum number of dimensions `inverse` operates on. Will be set to `forward_min_event_ndims` by default, if no value is provided.
• `parameters`: Python `dict` of parameters used to instantiate this `Bijector`.
• `name`: The name to give Ops created by the initializer.

#### Raises

• `ValueError`: If neither `forward_min_event_ndims` nor `inverse_min_event_ndims` is specified, or if either of them is negative.
• `ValueError`: If a member of `graph_parents` is not a `Tensor`.

#### Attributes

• `dtype`: dtype of `Tensor`s transformable by this bijector.
• `forward_min_event_ndims`: Returns the minimal number of dimensions `bijector.forward` operates on.
• `graph_parents`: Returns this `Bijector`'s graph_parents as a Python list.
• `inverse_min_event_ndims`: Returns the minimal number of dimensions `bijector.inverse` operates on.
• `is_constant_jacobian`: Returns true iff the Jacobian matrix is not a function of x.
• `name`: Returns the string name of this `Bijector`.
• `parameters`: Dictionary of parameters used to instantiate this `Bijector`.
• `temperature`
• `trainable_variables`
• `validate_args`: Returns `True` if `Tensor` arguments will be validated.
• `variables`

## Methods

### forward


Returns the forward Bijector evaluation, i.e., Y = g(X).

Args

• `x`: `Tensor`. The input to the 'forward' evaluation.
• `name`: The name to give this op.
• `**kwargs`: Named arguments forwarded to subclass implementation.

Returns

• `Tensor`.

Raises

• `TypeError`: if `self.dtype` is specified and `x.dtype` is not `self.dtype`.
• `NotImplementedError`: if `_forward` is not implemented.

### forward_dtype


Returns the dtype of the output of the forward transformation.

Args

• `dtype`: `tf.dtype`, or nested structure of `tf.dtype`s, of the input to `forward`.
• `name`: The name to give this op.
• `**kwargs`: Named arguments forwarded to subclass implementation.

Returns

• `tf.dtype` or nested structure of `tf.dtype`s of the output of `forward`.

### forward_event_shape


Shape of a single sample from a single batch as a TensorShape.

Same meaning as forward_event_shape_tensor. May be only partially defined.

Args

• `input_shape`: `TensorShape` indicating event-portion shape passed into `forward` function.

Returns

• `forward_event_shape_tensor`: `TensorShape` indicating event-portion shape after applying `forward`. Possibly unknown.

### forward_event_shape_tensor


Shape of a single sample from a single batch as an int32 1D Tensor.

Args

• `input_shape`: `Tensor`, int32 vector indicating event-portion shape passed into `forward` function.
• `name`: Name to give to the op.

Returns

• `forward_event_shape_tensor`: `Tensor`, int32 vector indicating event-portion shape after applying `forward`.

### forward_log_det_jacobian


Returns the forward log det Jacobian evaluation, i.e., log(det(dY/dX))(X). (Recall that: Y=g(X).)

Args

• `x`: `Tensor`. The input to the 'forward' Jacobian determinant evaluation.
• `event_ndims`: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to `self.forward_min_event_ndims`. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has shape `rank(x) - event_ndims` dimensions.
• `name`: The name to give this op.
• `**kwargs`: Named arguments forwarded to subclass implementation.

Returns

• `Tensor`, if this bijector is injective. If not injective, this is not implemented.

Raises

• `TypeError`: if `self.dtype` is specified and `x.dtype` is not `self.dtype`.
• `NotImplementedError`: if neither `_forward_log_det_jacobian` nor {`_inverse`, `_inverse_log_det_jacobian`} are implemented, or this is a non-injective bijector.
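To make the `event_ndims` reduction concrete, here is a small NumPy sketch. It uses the Exp bijector as an illustrative stand-in (not part of Softfloor) because its elementwise log det Jacobian is simply x:

```python
import numpy as np

def exp_forward_log_det_jacobian(x, event_ndims):
    x = np.asarray(x, dtype=np.float64)
    elementwise = x  # for y = exp(x), log|dy/dx| = x at each element
    # Sum over the trailing `event_ndims` dimensions to get one scalar
    # log det Jacobian per event; the result has rank(x) - event_ndims dims.
    axes = tuple(range(x.ndim - event_ndims, x.ndim))
    return np.sum(elementwise, axis=axes)

x = np.array([[0.5, 1.5],
              [2.0, 3.0]])
exp_forward_log_det_jacobian(x, event_ndims=0).shape  # (2, 2)
exp_forward_log_det_jacobian(x, event_ndims=1)        # [2., 5.]
```

With `event_ndims=0` each scalar is its own event, so nothing is summed; with `event_ndims=1` each row is one event and the last axis is reduced away.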

### inverse


Returns the inverse Bijector evaluation, i.e., X = g^{-1}(Y).

Args

• `y`: `Tensor`. The input to the 'inverse' evaluation.
• `name`: The name to give this op.
• `**kwargs`: Named arguments forwarded to subclass implementation.

Returns

• `Tensor`, if this bijector is injective. If not injective, returns the k-tuple containing the unique k points (x1, ..., xk) such that g(xi) = y.

Raises

• `TypeError`: if `self.dtype` is specified and `y.dtype` is not `self.dtype`.
• `NotImplementedError`: if `_inverse` is not implemented.

### inverse_dtype


Returns the dtype of the output of the inverse transformation.

Args

• `dtype`: `tf.dtype`, or nested structure of `tf.dtype`s, of the input to `inverse`.
• `name`: The name to give this op.
• `**kwargs`: Named arguments forwarded to subclass implementation.

Returns

• `tf.dtype` or nested structure of `tf.dtype`s of the output of `inverse`.

### inverse_event_shape


Shape of a single sample from a single batch as a TensorShape.

Same meaning as inverse_event_shape_tensor. May be only partially defined.

Args

• `output_shape`: `TensorShape` indicating event-portion shape passed into `inverse` function.

Returns

• `inverse_event_shape_tensor`: `TensorShape` indicating event-portion shape after applying `inverse`. Possibly unknown.

### inverse_event_shape_tensor


Shape of a single sample from a single batch as an int32 1D Tensor.

Args

• `output_shape`: `Tensor`, int32 vector indicating event-portion shape passed into `inverse` function.
• `name`: Name to give to the op.

Returns

• `inverse_event_shape_tensor`: `Tensor`, int32 vector indicating event-portion shape after applying `inverse`.

### inverse_log_det_jacobian


Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: log(det(dX/dY))(Y). (Recall that: X=g^{-1}(Y).)

Note that forward_log_det_jacobian is the negative of this function, evaluated at g^{-1}(y).

Args

• `y`: `Tensor`. The input to the 'inverse' Jacobian determinant evaluation.
• `event_ndims`: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to `self.inverse_min_event_ndims`. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has shape `rank(y) - event_ndims` dimensions.
• `name`: The name to give this op.
• `**kwargs`: Named arguments forwarded to subclass implementation.

Returns

• `ildj`: `Tensor`, if this bijector is injective. If not injective, returns the tuple of local log det Jacobians, `log(det(Dg_i^{-1}(y)))`, where `g_i` is the restriction of `g` to the `i`th partition `Di`.

Raises

• `TypeError`: if `self.dtype` is specified and `y.dtype` is not `self.dtype`.
• `NotImplementedError`: if `_inverse_log_det_jacobian` is not implemented.
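The relationship noted above (forward_log_det_jacobian is the negative of this function, evaluated at g^{-1}(y)) is easy to verify numerically. The sketch below uses the Exp bijector purely as an illustration:

```python
import numpy as np

# For the Exp bijector: g(x) = exp(x), g^{-1}(y) = log(y).
fldj = lambda x: x           # log|d exp(x)/dx| = log(exp(x)) = x
ildj = lambda y: -np.log(y)  # log|d log(y)/dy| = log(1/y) = -log(y)

y = np.array([0.5, 1.0, 4.0])
# inverse_log_det_jacobian(y) == -forward_log_det_jacobian(g^{-1}(y)):
np.allclose(ildj(y), -fldj(np.log(y)))  # True
```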

### __call__


Applies or composes the Bijector, depending on input type.

This is a convenience function which applies the Bijector instance in three different ways, depending on the input:

1. If the input is a tfd.Distribution instance, return tfd.TransformedDistribution(distribution=input, bijector=self).
2. If the input is a tfb.Bijector instance, return tfb.Chain([self, input]).
3. Otherwise, return self.forward(input).

Args

• `value`: A `tfd.Distribution`, `tfb.Bijector`, or a `Tensor`.
• `name`: Python `str` name given to ops created by this function.
• `**kwargs`: Additional keyword arguments passed into the created `tfd.TransformedDistribution`, `tfb.Bijector`, or `self.forward`.

Returns

• `composition`: A `tfd.TransformedDistribution` if the input was a `tfd.Distribution`, a `tfb.Chain` if the input was a `tfb.Bijector`, or a `Tensor` computed by `self.forward`.

#### Examples

```python
sigmoid = tfb.Reciprocal()(
    tfb.AffineScalar(shift=1.)(
        tfb.Exp()(
            tfb.AffineScalar(scale=-1.))))
# ==> `tfb.Chain([
#         tfb.Reciprocal(),
#         tfb.AffineScalar(shift=1.),
#         tfb.Exp(),
#         tfb.AffineScalar(scale=-1.),
#      ])`  # ie, `tfb.Sigmoid()`

log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`

tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])
```