TensorFlow Probability
TensorFlow Probability is a library for probabilistic reasoning and statistical
analysis in TensorFlow. As part of the TensorFlow ecosystem, TensorFlow
Probability provides integration of probabilistic methods with deep networks,
gradient-based inference using automatic differentiation, and scalability to
large datasets and models with hardware acceleration (GPUs) and distributed
computation.
To get started with TensorFlow Probability, see the
install guide and view the
Python notebook tutorials.
Components
Our probabilistic machine learning tools are structured as follows:
Layer 0: TensorFlow
Numerical operations, in particular the LinearOperator class, enable matrix-free implementations that can exploit a particular structure (diagonal, low-rank, etc.) for efficient computation. LinearOperator is built and maintained by the TensorFlow Probability team and is part of tf.linalg in core TensorFlow.
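As a minimal sketch of why structure matters, the diagonal operator below stores only its diagonal and never materializes a dense matrix, so multiplication, solves, and the log-determinant all run in linear time (the size and values here are illustrative):

```python
import tensorflow as tf

# A 1000 x 1000 diagonal operator represented by just 1,000 numbers.
diag = tf.linspace(1.0, 2.0, 1000)
operator = tf.linalg.LinearOperatorDiag(diag)

x = tf.ones([1000, 1])
y = operator.matmul(x)                    # O(n) elementwise scaling
x_back = operator.solve(y)                # O(n) solve via the structure
log_det = operator.log_abs_determinant()  # O(n) sum of log|diag|
```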
Layer 1: Statistical Building Blocks
- Distributions (tfp.distributions): A large collection of probability distributions and related statistics with batch and broadcasting semantics. Both building blocks are sketched after this list.
- Bijectors (tfp.bijectors): Reversible and composable transformations of random variables. Bijectors provide a rich class of transformed distributions, from classical examples like the log-normal distribution to sophisticated deep learning models such as masked autoregressive flows.
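A minimal sketch of both building blocks (the parameter values are illustrative): batch semantics let one Normal object represent three distributions at once, and the Exp bijector turns a standard Normal into a log-normal:

```python
import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

# Batch semantics: three Normals at once; the scalar scale broadcasts.
normals = tfd.Normal(loc=[-1., 0., 1.], scale=0.5)
samples = normals.sample(10)           # shape [10, 3]
log_probs = normals.log_prob(samples)  # shape [10, 3]

# Bijectors: a log-normal as Exp applied to a standard Normal.
log_normal = tfd.TransformedDistribution(
    distribution=tfd.Normal(loc=0., scale=1.),
    bijector=tfb.Exp())
positive_draws = log_normal.sample(5)  # strictly positive samples
```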
Layer 2: Model Building
- Joint Distributions (e.g., tfp.distributions.JointDistributionSequential): Joint distributions over one or more possibly-interdependent distributions. For an introduction to modeling with TFP's JointDistributions, check out this colab. A short sketch follows this list.
- Probabilistic layers (tfp.layers): Neural network layers with uncertainty over the functions they represent, extending TensorFlow layers.
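As a minimal sketch (the two-variable model is illustrative, not taken from this page), a JointDistributionSequential lists distributions in dependency order; each lambda receives the sample of the distribution defined just before it:

```python
import tensorflow_probability as tfp

tfd = tfp.distributions

# A tiny hierarchical model:
#   sigma ~ HalfNormal(1), then x | sigma ~ Normal(0, sigma).
model = tfd.JointDistributionSequential([
    tfd.HalfNormal(scale=1.),                       # sigma
    lambda sigma: tfd.Normal(loc=0., scale=sigma),  # x given sigma
])

sigma, x = model.sample()              # draw from the joint distribution
joint_lp = model.log_prob([sigma, x])  # log p(sigma) + log p(x | sigma)
```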
Layer 3: Probabilistic Inference
- Markov chain Monte Carlo (tfp.mcmc): Algorithms for approximating integrals via sampling. Includes Hamiltonian Monte Carlo, random-walk Metropolis-Hastings, and the ability to build custom transition kernels. A short sketch follows this list.
- Variational Inference (tfp.vi): Algorithms for approximating integrals through optimization.
- Optimizers (tfp.optimizer): Stochastic optimization methods, extending TensorFlow Optimizers. Includes Stochastic Gradient Langevin Dynamics.
- Monte Carlo (tfp.monte_carlo): Tools for computing Monte Carlo expectations.
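As a minimal sketch of the MCMC workflow (the standard-normal target and tuning values are illustrative), tfp.mcmc.sample_chain drives a Hamiltonian Monte Carlo transition kernel toward a user-supplied log density:

```python
import tensorflow as tf
import tensorflow_probability as tfp

# Unnormalized target density: a standard normal in two dimensions.
def target_log_prob(x):
  return -0.5 * tf.reduce_sum(tf.square(x))

hmc = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target_log_prob,
    step_size=0.1,
    num_leapfrog_steps=3)

samples, is_accepted = tfp.mcmc.sample_chain(
    num_results=500,
    num_burnin_steps=100,
    current_state=tf.zeros([2]),
    kernel=hmc,
    trace_fn=lambda _, pkr: pkr.is_accepted)
```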
TensorFlow Probability is under active development and interfaces may change.
Examples
In addition to the
Python notebook tutorials
listed in the navigation, there are some example scripts available:
- Variational Autoencoders: representation learning with a latent code and variational inference.
- Vector-Quantized Autoencoder: discrete representation learning with vector quantization.
- Bayesian Neural Networks: neural networks with uncertainty over their weights.
- Bayesian Logistic Regression: Bayesian inference for binary classification.
Report issues
Report bugs or feature requests using the
TensorFlow Probability issue tracker.