Keras
tf.keras is TensorFlow's high-level API for building and training deep learning models. It is used for fast prototyping, state-of-the-art research, and production, and it has three key advantages:
- User friendly: Keras has a simple, consistent interface optimized for common use cases. It provides clear, actionable feedback for user errors.
- Modular and composable: Keras models are built by connecting configurable building blocks together, with few restrictions.
- Easy to extend: Write custom building blocks to express new ideas for research. Create new layers, metrics, and loss functions, and develop state-of-the-art models. A minimal sketch of a custom layer follows this list.
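For illustration, here is a minimal sketch of the kind of custom building block Keras supports: a layer that creates its own weights and defines its forward computation. The `SimpleDense` name, the layer width, and the initializers are illustrative choices, not part of the original guide.

```python
import tensorflow as tf

# A minimal sketch of a custom layer: weights are created in build(),
# and the forward computation is defined in call().
class SimpleDense(tf.keras.layers.Layer):
    def __init__(self, units=32):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        # Trainable weights are created lazily, once the input shape is known.
        self.w = self.add_weight(
            shape=(input_shape[-1], self.units),
            initializer="glorot_uniform",
            trainable=True,
        )
        self.b = self.add_weight(
            shape=(self.units,), initializer="zeros", trainable=True
        )

    def call(self, inputs):
        return tf.matmul(inputs, self.w) + self.b


# The custom layer composes with built-in layers like any other element.
model = tf.keras.Sequential([SimpleDense(64), tf.keras.layers.Activation("relu")])
```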
The Keras: A quick overview guide will help you get started. For an introduction to machine learning with tf.keras, see this series of beginner tutorials.
To learn more about the API, see the guides covering what you need to know as a TensorFlow Keras power user.
Watch Inside TensorFlow on YouTube for a deeper dive into Keras.
# Keras: The high-level API for TensorFlow

Keras is the high-level API of the TensorFlow platform. It provides an approachable, highly-productive interface for solving machine learning (ML) problems, with a focus on modern deep learning. Keras covers every step of the machine learning workflow, from data processing to hyperparameter tuning to deployment. It was developed with a focus on enabling fast experimentation.

With Keras, you have full access to the scalability and cross-platform capabilities of TensorFlow. You can run Keras on a TPU Pod or large clusters of GPUs, and you can export Keras models to run in the browser or on mobile devices. You can also serve Keras models via a web API.

Keras is designed to reduce cognitive load by achieving the following goals:

- Offer simple, consistent interfaces.
- Minimize the number of actions required for common use cases.
- Provide clear, actionable error messages.
- Follow the principle of progressive disclosure of complexity: It's easy to get started, and you can complete advanced workflows by learning as you go.
- Help you write concise, readable code.

Who should use Keras
--------------------

The short answer is that every TensorFlow user should use the Keras APIs by default. Whether you're an engineer, a researcher, or an ML practitioner, you should start with Keras.

There are a few use cases (for example, building tools on top of TensorFlow or developing your own high-performance platform) that require the low-level [TensorFlow Core APIs](https://www.tensorflow.org/guide/core). But if your use case doesn't fall into one of the [Core API applications](https://www.tensorflow.org/guide/core#core_api_applications), you should prefer Keras.

Keras API components
--------------------

The core data structures of Keras are [layers](https://keras.io/api/layers/) and [models](https://keras.io/api/models/). A layer is a simple input/output transformation, and a model is a directed acyclic graph (DAG) of layers.

### Layers

The `tf.keras.layers.Layer` class is the fundamental abstraction in Keras. A `Layer` encapsulates a state (weights) and some computation (defined in the `tf.keras.layers.Layer.call` method).

Weights created by layers can be trainable or non-trainable. Layers are recursively composable: If you assign a layer instance as an attribute of another layer, the outer layer will start tracking the weights created by the inner layer.

You can also use layers to handle data preprocessing tasks like normalization and text vectorization. Preprocessing layers can be included directly into a model, either during or after training, which makes the model portable.

### Models

A model is an object that groups layers together and that can be trained on data.

The simplest type of model is the [`Sequential` model](https://www.tensorflow.org/guide/keras/sequential_model), which is a linear stack of layers.
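As an illustration, a minimal `Sequential` model might look like the following sketch; the input shape and layer widths are placeholder values, not part of this guide.

```python
import tensorflow as tf

# A minimal sketch of a Sequential model: a linear stack of layers.
# The input shape and layer widths are placeholder values.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.summary()  # prints the layer stack and parameter counts
```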
For more complex architectures, you can either use the [Keras functional API](https://www.tensorflow.org/guide/keras/functional_api), which lets you build arbitrary graphs of layers, or [use subclassing to write models from scratch](https://www.tensorflow.org/guide/keras/making_new_layers_and_models_via_subclassing).

The `tf.keras.Model` class features built-in training and evaluation methods:

- `tf.keras.Model.fit`: Trains the model for a fixed number of epochs.
- `tf.keras.Model.predict`: Generates output predictions for the input samples.
- `tf.keras.Model.evaluate`: Returns the loss and metrics values for the model; configured via the `tf.keras.Model.compile` method.

These methods give you access to the following built-in training features:

- [Callbacks](https://www.tensorflow.org/api_docs/python/tf/keras/callbacks). You can leverage built-in callbacks for early stopping, model checkpointing, and [TensorBoard](https://www.tensorflow.org/tensorboard) monitoring. You can also [implement custom callbacks](https://www.tensorflow.org/guide/keras/writing_your_own_callbacks).
- [Distributed training](https://www.tensorflow.org/guide/keras/distributed_training). You can easily scale up your training to multiple GPUs, TPUs, or devices.
- Step fusing. With the `steps_per_execution` argument in `tf.keras.Model.compile`, you can process multiple batches in a single `tf.function` call, which greatly improves device utilization on TPUs.

For a detailed overview of how to use `fit`, see the [training and evaluation guide](https://www.tensorflow.org/guide/keras/training_with_built_in_methods). To learn how to customize the built-in training and evaluation loops, see [Customizing what happens in `fit()`](https://www.tensorflow.org/guide/keras/customizing_what_happens_in_fit).

### Other APIs and tools

Keras provides many other APIs and tools for deep learning, including:

- [Optimizers](https://keras.io/api/optimizers/)
- [Metrics](https://keras.io/api/metrics/)
- [Losses](https://keras.io/api/losses/)
- [Data loading utilities](https://keras.io/api/data_loading/)

For a full list of available APIs, see the [Keras API reference](https://keras.io/api/).
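To tie these pieces together, here is a hedged sketch of the built-in training workflow: explicit optimizer, loss, and metric objects passed to `compile()`, then `fit()` with an early-stopping callback. The random data, layer sizes, and hyperparameters are placeholders, not values from this guide.

```python
import numpy as np
import tensorflow as tf

# Placeholder data for a toy binary classification task.
x = np.random.random((256, 20)).astype("float32")
y = np.random.randint(0, 2, size=(256, 1))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# compile() configures the optimizer, loss, and metrics used by fit()/evaluate().
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss=tf.keras.losses.BinaryCrossentropy(),
    metrics=[tf.keras.metrics.BinaryAccuracy()],
)

# fit() trains for a fixed number of epochs; the built-in EarlyStopping
# callback halts training when the monitored value stops improving.
model.fit(
    x, y,
    epochs=5,
    validation_split=0.2,
    callbacks=[tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=2)],
)

loss, accuracy = model.evaluate(x, y, verbose=0)  # loss and metric values
preds = model.predict(x[:3])                      # output predictions
```

When scaling to multiple GPUs or TPUs, the same model construction and `compile()` call are typically wrapped in a `tf.distribute` strategy scope, and `fit()` handles the distribution.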
To learn more about other Keras projects and initiatives, see [The Keras ecosystem](https://keras.io/getting_started/ecosystem/).

Next steps
----------

To get started using Keras with TensorFlow, check out the following topics:

- [The Sequential model](https://www.tensorflow.org/guide/keras/sequential_model)
- [The Functional API](https://www.tensorflow.org/guide/keras/functional)
- [Training & evaluation with the built-in methods](https://www.tensorflow.org/guide/keras/training_with_built_in_methods)
- [Making new layers and models via subclassing](https://www.tensorflow.org/guide/keras/custom_layers_and_models)
- [Serialization and saving](https://www.tensorflow.org/guide/keras/save_and_serialize)
- [Working with preprocessing layers](https://www.tensorflow.org/guide/keras/preprocessing_layers)
- [Customizing what happens in fit()](https://www.tensorflow.org/guide/keras/customizing_what_happens_in_fit)
- [Writing a training loop from scratch](https://www.tensorflow.org/guide/keras/writing_a_training_loop_from_scratch)
- [Working with RNNs](https://www.tensorflow.org/guide/keras/rnn)
- [Understanding masking & padding](https://www.tensorflow.org/guide/keras/masking_and_padding)
- [Writing your own callbacks](https://www.tensorflow.org/guide/keras/custom_callback)
- [Transfer learning & fine-tuning](https://www.tensorflow.org/guide/keras/transfer_learning)
- [Multi-GPU and distributed training](https://www.tensorflow.org/guide/keras/distributed_training)

To learn more about Keras, see the following topics at [keras.io](http://keras.io):

- [About Keras](https://keras.io/about/)
- [Introduction to Keras for Engineers](https://keras.io/getting_started/intro_to_keras_for_engineers/)
- [Introduction to Keras for Researchers](https://keras.io/getting_started/intro_to_keras_for_researchers/)
- [Keras API reference](https://keras.io/api/)
- [The Keras ecosystem](https://keras.io/getting_started/ecosystem/)