TensorFlow Lite uses TensorFlow models converted into a smaller, more efficient machine learning (ML) model format. You can use pre-trained models with TensorFlow Lite, modify existing models, or build your own TensorFlow models and then convert them to TensorFlow Lite format. TensorFlow Lite models can perform almost any task a regular TensorFlow model can: object detection, natural language processing, pattern recognition, and more, using a wide range of input data including images, video, audio, and text.

Skip to the Convert section for information about getting your model to run with TensorFlow Lite.
For guidance on getting models for your use case, keep reading.

You don't have to build a TensorFlow Lite model to start using machine learning on mobile or edge devices. Many already-built and optimized models are available for you to use right away in your application. You can start by using pre-trained models in TensorFlow Lite and move up to building custom models over time, as follows:

  1. Start developing machine learning features with already trained models.
  2. Modify existing TensorFlow Lite models using tools such as Model Maker.
  3. Build a custom model with TensorFlow tools and then convert it to TensorFlow Lite.
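The conversion mentioned in step 3 can be sketched with the standard `tf.lite.TFLiteConverter` API. This is a minimal illustration using a tiny stand-in Keras model, not a production workflow; the layer sizes and input shape are arbitrary:

```python
import numpy as np
import tensorflow as tf

# A trivial Keras model standing in for your trained TensorFlow model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.build(input_shape=(None, 8))

# Convert the model to the TensorFlow Lite flatbuffer format (bytes).
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Run the converted model with the TensorFlow Lite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_detail = interpreter.get_input_details()[0]
interpreter.set_tensor(input_detail["index"],
                       np.zeros((1, 8), dtype=np.float32))
interpreter.invoke()
output_detail = interpreter.get_output_details()[0]
result = interpreter.get_tensor(output_detail["index"])
```

The resulting `tflite_model` bytes are typically written to a `.tflite` file and bundled with your mobile or edge application.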

If you are trying to quickly implement features or utility tasks with machine learning, you should review the use cases supported by ML Kit before starting development with TensorFlow Lite. This development tool provides APIs you can call directly from mobile apps to complete common ML tasks such as barcode scanning and on-device translation. Using this method can help you get results fast. However, ML Kit has limited options for extending its capabilities. For more information, see the ML Kit developer documentation.

If building a custom model for your specific use case is your ultimate goal, you should start by developing and training a TensorFlow model or extending an existing one. Before you start your model development process, you should be aware of the constraints for TensorFlow Lite models and build your model with these constraints in mind:

  • Limited compute capabilities
  • Size of models
  • Size of data
  • Supported TensorFlow operations

For more detail about each of these constraints, see model design constraints in the Model build overview. For more information about building effective, compatible, high-performance models for TensorFlow Lite, see Performance best practices.
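One common way to work within the model-size constraint is post-training quantization, enabled through the converter's `optimizations` flag. The sketch below compares a baseline conversion with dynamic-range quantization on a small stand-in Keras model (the model itself and its shapes are illustrative assumptions, not part of any real application):

```python
import tensorflow as tf

# A stand-in model large enough that quantization savings are visible.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.build(input_shape=(None, 64))

# Baseline conversion: weights stay in 32-bit float.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
baseline_model = converter.convert()

# Dynamic-range quantization: weights are stored as 8-bit integers,
# which shrinks the flatbuffer and helps with the size constraints above.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
quantized_model = converter.convert()
```

For the supported-operations constraint, the same converter object exposes `converter.target_spec.supported_ops`, which you can use to allow select TensorFlow ops when a model uses operations outside the built-in TensorFlow Lite set.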

Learn how to pick a pre-trained ML model to use with TensorFlow Lite.
Use TensorFlow Lite Model Maker to modify models using your training data.
Learn how to build custom TensorFlow models to use with TensorFlow Lite.