Build a scale-and-shift function using a multi-layer neural network.
tfp.bijectors.real_nvp_default_template(
    hidden_layers,
    shift_only=False,
    activation=tf.nn.relu,
    name=None,
    *args,
    **kwargs
)
This will be wrapped in a `make_template` to ensure the variables are only
created once. It takes the d-dimensional input x[0:d] and returns the
(D-d)-dimensional outputs `loc` ('mu') and `log_scale` ('alpha').
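For context, here is a minimal usage sketch that plugs the default template into a `RealNVP` bijector over a 3-dimensional event. The layer sizes and event size are illustrative assumptions, not requirements:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors
tfd = tfp.distributions

# RealNVP over a 3-dimensional event: the first 2 dimensions pass through
# unchanged and parameterize the shift/log-scale of the remaining dimension.
nvp = tfd.TransformedDistribution(
    distribution=tfd.MultivariateNormalDiag(loc=[0., 0., 0.]),
    bijector=tfb.RealNVP(
        num_masked=2,
        shift_and_log_scale_fn=tfb.real_nvp_default_template(
            hidden_layers=[512, 512])))

x = nvp.sample(4)   # samples pushed through the coupling layer
nvp.log_prob(x)     # densities via the change-of-variables formula
```

Because the template's variables are created lazily on the first call and then reused, the same callable serves both the forward and inverse passes of the bijector.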
The default template does not support conditioning and will raise an
exception if `condition_kwargs` are passed to it. To use conditioning with the
RealNVP bijector, implement a conditioned shift/scale template that
handles the `condition_kwargs`, as sketched below.
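The sketch below is not part of the library; the `conditioning` keyword, the Keras layers, and the sizes are assumptions made for illustration. The only contract assumed is the one described above: the callable receives the masked input slice (plus any forwarded `condition_kwargs`) and returns a `(shift, log_scale)` pair.

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

def make_conditioned_template(hidden_units, output_units):
  # Layers are created once here and closed over, mirroring the
  # variables-created-once behavior of the default template.
  hidden = tf.keras.layers.Dense(hidden_units, activation=tf.nn.relu)
  params = tf.keras.layers.Dense(2 * output_units)

  def _fn(x, output_units, conditioning=None, **kwargs):
    # `output_units` is accepted for interface compatibility; the output
    # size was fixed when the layers above were built.
    if conditioning is not None:
      x = tf.concat([x, conditioning], axis=-1)  # condition the network input
    shift, log_scale = tf.split(params(hidden(x)), num_or_size_splits=2, axis=-1)
    return shift, log_scale

  return _fn

# Hypothetical usage: the conditioning tensor is forwarded by the bijector
# to the template as a condition kwarg.
fn = make_conditioned_template(hidden_units=64, output_units=3)
nvp = tfb.RealNVP(num_masked=2, shift_and_log_scale_fn=fn)
x = tf.random.normal([8, 5])
cond = tf.random.normal([8, 16])
y = nvp.forward(x, conditioning=cond)
x_recovered = nvp.inverse(y, conditioning=cond)
```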
| Returns | |
|---|---|
| `shift` | Float-like `Tensor` of shift terms ('mu' in [Papamakarios et al. (2017)][1]). |
| `log_scale` | Float-like `Tensor` of log(scale) terms ('alpha' in [Papamakarios et al. (2017)][1]). |
| Raises | |
|---|---|
| `NotImplementedError` | If the rightmost dimension of the inputs is unknown prior to graph execution, or if `condition_kwargs` is not empty. |
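As a rough illustration of the two returned tensors and of the static-shape requirement above, the built template can also be called directly on the masked slice. The sizes below are assumptions for a 5-dimensional event with 2 masked dimensions, and the sketch uses TF1-style graph construction since the template is built on `tf.compat.v1.make_template`:

```python
import tensorflow.compat.v1 as tf1
import tensorflow_probability as tfp

tfb = tfp.bijectors

with tf1.Graph().as_default():
  fn = tfb.real_nvp_default_template(hidden_layers=[32, 32])
  # Masked slice x[0:d] with d = 2; the rightmost dimension must be
  # statically known, otherwise NotImplementedError is raised.
  x0 = tf1.placeholder(tf1.float32, shape=[None, 2])
  shift, log_scale = fn(x0, 3)   # output_units = D - d = 3
  # shift.shape == log_scale.shape == [None, 3]
```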
References
[1]: George Papamakarios, Theo Pavlakou, and Iain Murray. Masked Autoregressive Flow for Density Estimation. In Neural Information Processing Systems, 2017. https://arxiv.org/abs/1705.07057