Deploy machine learning models on mobile and edge devices

TensorFlow Lite is a library for deploying models on mobile devices, microcontrollers, and other edge devices.

See the guide

Guides explain the concepts and components of TensorFlow Lite.

See examples

Explore TensorFlow Lite Android and iOS apps.

See tutorials

Learn how to use TensorFlow Lite for common use cases.

How it works

Pick a model

Pick a new model or retrain an existing one.

Convert

Convert a TensorFlow model into a compressed FlatBuffer with the TensorFlow Lite Converter.
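
A minimal conversion sketch in Python, assuming you have a SavedModel exported at a placeholder path my_saved_model:

```python
import tensorflow as tf

# Load the SavedModel and convert it to the TensorFlow Lite FlatBuffer format.
converter = tf.lite.TFLiteConverter.from_saved_model("my_saved_model")  # placeholder path
tflite_model = converter.convert()

# Write the serialized .tflite model to disk.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```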

Deploy

Take the compressed .tflite file and load it into a mobile or embedded device.
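
On a device you would typically use the Android, iOS, or C++ interpreter APIs, but the same flow can be sketched with the Python interpreter, assuming the model.tflite file produced in the previous step:

```python
import numpy as np
import tensorflow as tf

# Load the compressed .tflite model and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input that matches the model's expected shape and dtype.
dummy_input = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()

# Read the output tensor back out.
output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)
```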

Optimize

Quantize your model by converting 32-bit floats to more efficient 8-bit integers, or run it on the GPU.
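
A sketch of post-training dynamic-range quantization, assuming the same placeholder SavedModel path as above; weights are stored as 8-bit integers, which shrinks the model and can speed up inference:

```python
import tensorflow as tf

# Enable the default optimizations, which apply post-training quantization.
converter = tf.lite.TFLiteConverter.from_saved_model("my_saved_model")  # placeholder path
converter.optimizations = [tf.lite.Optimize.DEFAULT]
quantized_model = converter.convert()

with open("model_quant.tflite", "wb") as f:
    f.write(quantized_model)
```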

Solutions to common problems

Explore optimized TF Lite models and on-device ML solutions for mobile and edge use cases.

Image classification

Identify hundreds of objects, including people, activities, animals, plants, and places.

Object detection

Detect multiple objects with bounding boxes. Yes, dogs and cats too.

Question answering

Use BERT, a state-of-the-art natural language model, to answer questions based on the content of a given passage of text.

News & announcements

Check out our blog for additional updates, and subscribe to our TensorFlow newsletter to get the latest announcements sent directly to your inbox.