Tools
Explore tools to support and accelerate TensorFlow workflows.
Colab
Colaboratory is a free Jupyter notebook environment that requires no setup and runs entirely in the cloud, allowing you to execute TensorFlow code in your browser with a single click.
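As a rough illustration (not taken from the page itself), this is the kind of minimal TensorFlow snippet you could paste into a Colab cell and run with no local installation:

```python
import tensorflow as tf  # preinstalled in Colab runtimes

# Run a small computation eagerly; no setup or hardware configuration needed.
x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
y = tf.matmul(x, tf.transpose(x))

print(tf.__version__)
print(y.numpy())
```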
Visual Blocks
A visual coding web framework to prototype ML workflows using I/O devices, models, data augmentation, and even Colab code as reusable building blocks.
What-If Tool
A tool for code-free probing of machine learning models, useful for model understanding, debugging, and fairness. Available in TensorBoard and Jupyter or Colab notebooks.
ML Perf
A broad ML benchmark suite for measuring performance of ML software frameworks, ML hardware accelerators, and ML cloud platforms.
XLA
XLA (Accelerated Linear Algebra) is a domain-specific compiler for linear algebra that optimizes TensorFlow computations. The results are improvements in speed, memory usage, and portability on server and mobile platforms.
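As a minimal sketch (assuming TensorFlow 2.x, which exposes XLA through tf.function's jit_compile flag), a single function can be opted into XLA compilation like this; the function and tensor shapes are illustrative only:

```python
import tensorflow as tf

# jit_compile=True asks TensorFlow to compile this function with XLA
# instead of executing it op by op.
@tf.function(jit_compile=True)
def dense_relu(x, w, b):
    return tf.nn.relu(tf.matmul(x, w) + b)

x = tf.random.normal([8, 16])
w = tf.random.normal([16, 4])
b = tf.zeros([4])

print(dense_relu(x, w, b).shape)  # (8, 4)
```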
TPU Research Cloud
The TPU Research Cloud (TRC) program enables researchers to apply for access to a cluster of more than 1,000 Cloud TPUs at no charge, to help them accelerate the next wave of research breakthroughs.
TensorBoard
A suite of visualization tools to understand, debug, and optimize TensorFlow programs.
TensorFlow Playground
Tinker with a neural network in your browser. Don't worry, you can't break it.
MLIR
A new intermediate representation and compiler framework.
Explore libraries that build advanced models, methods, and extensions using TensorFlow