TensorFlow Probability
TensorFlow Probability is a library for probabilistic reasoning and statistical analysis in TensorFlow. As part of the TensorFlow ecosystem, TensorFlow Probability provides integration of probabilistic methods with deep networks, gradient-based inference using automatic differentiation, and scalability to large datasets and models with hardware acceleration (GPUs) and distributed computation.
To get started with TensorFlow Probability, see the [install guide](./install) and view the [Python notebook tutorials](https://github.com/tensorflow/probability/blob/main/tensorflow_probability/examples/jupyter_notebooks/).
Components

Our probabilistic machine learning tools are structured as follows:
Layer 0: TensorFlow

*Numerical operations*, in particular the `LinearOperator` class, enable matrix-free implementations that can exploit particular structure (diagonal, low-rank, etc.) for efficient computation. It is built and maintained by the TensorFlow Probability team and is part of [`tf.linalg`](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/python/ops/linalg) in core TensorFlow.
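To make the "matrix-free" idea concrete, here is a minimal sketch using `tf.linalg.LinearOperatorDiag` from core TensorFlow (the values chosen are illustrative):

```python
import tensorflow as tf

# A diagonal operator stores only the n diagonal entries instead of a dense
# n x n matrix; matrix-vector products cost O(n) rather than O(n^2).
op = tf.linalg.LinearOperatorDiag(tf.constant([1., 2., 3.]))

# matvec scales elementwise by the diagonal.
y = op.matvec(tf.constant([1., 1., 1.]))
print(y.numpy())                         # [1. 2. 3.]

# Structure-aware log-determinant: just the sum of log|diag| entries.
print(op.log_abs_determinant().numpy())  # log(1 * 2 * 3)
```

Other structured operators (`LinearOperatorLowRankUpdate`, `LinearOperatorIdentity`, etc.) follow the same interface, so downstream code can stay agnostic to the concrete structure.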
Layer 1: Statistical Building Blocks

- *Distributions* ([`tfp.distributions`](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/python/distributions)): A large collection of probability distributions and related statistics with batch and [broadcasting](https://docs.scipy.org/doc/numpy-1.14.0/user/basics.broadcasting.html) semantics.
- *Bijectors* ([`tfp.bijectors`](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/python/bijectors)): Reversible and composable transformations of random variables. Bijectors provide a rich class of transformed distributions, from classical examples like the [log-normal distribution](https://en.wikipedia.org/wiki/Log-normal_distribution) to sophisticated deep learning models such as [masked autoregressive flows](https://arxiv.org/abs/1705.07057).
Layer 2: Model Building

- *Joint distributions* (e.g., [`tfp.distributions.JointDistributionSequential`](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/python/distributions/joint_distribution_sequential.py)): Joint distributions over one or more possibly-interdependent distributions. For an introduction to modeling with TFP's `JointDistribution`s, check out [this colab](https://github.com/tensorflow/probability/blob/main/tensorflow_probability/examples/jupyter_notebooks/Modeling_with_JointDistribution.ipynb).
- *Probabilistic layers* ([`tfp.layers`](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/python/layers)): Neural network layers with uncertainty over the functions they represent, extending TensorFlow layers.
Layer 3: Probabilistic Inference

- *Markov chain Monte Carlo* ([`tfp.mcmc`](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/python/mcmc)): Algorithms for approximating integrals via sampling. Includes [Hamiltonian Monte Carlo](https://en.wikipedia.org/wiki/Hamiltonian_Monte_Carlo), random-walk Metropolis-Hastings, and the ability to build custom transition kernels.
- *Variational inference* ([`tfp.vi`](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/python/vi)): Algorithms for approximating integrals through optimization.
- *Optimizers* ([`tfp.optimizer`](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/python/optimizer)): Stochastic optimization methods, extending TensorFlow Optimizers. Includes [Stochastic Gradient Langevin Dynamics](http://www.icml-2011.org/papers/398_icmlpaper.pdf).
- *Monte Carlo* ([`tfp.monte_carlo`](https://github.com/tensorflow/probability/blob/main/tensorflow_probability/python/monte_carlo)): Tools for computing Monte Carlo expectations.
TensorFlow Probability is under active development and interfaces may change.
Examples

In addition to the [Python notebook tutorials](https://github.com/tensorflow/probability/blob/main/tensorflow_probability/examples/jupyter_notebooks/) listed in the navigation, there are some example scripts available:

- [Variational Autoencoders](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/examples/vae.py): Representation learning with a latent code and variational inference.
- [Vector-Quantized Autoencoder](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/examples/vq_vae.py): Discrete representation learning with vector quantization.
- [Bayesian Neural Networks](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/examples/bayesian_neural_network.py): Neural networks with uncertainty over their weights.
- [Bayesian Logistic Regression](https://github.com/tensorflow/probability/tree/main/tensorflow_probability/examples/logistic_regression.py): Bayesian inference for binary classification.
Report issues

Report bugs or feature requests using the [TensorFlow Probability issue tracker](https://github.com/tensorflow/probability/issues).
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated: 2024-02-12 (UTC)