Deploy machine learning models on mobile and IoT devices

TensorFlow Lite is an open-source deep learning framework for on-device inference.

See the guide

Guides explain the concepts and components of TensorFlow Lite.

See examples

Explore TensorFlow Lite Android and iOS apps.

See models

Easily deploy pre-trained models.

How it works

Pick a model

Pick a new model or retrain an existing one.


Convert

Convert a TensorFlow model into a compressed flat buffer with the TensorFlow Lite Converter.
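This conversion step can be sketched with the Python `tf.lite.TFLiteConverter` API. The tiny Keras model below is a placeholder for your trained model, and `model.tflite` is an example output path:

```python
import tensorflow as tf

# Stand-in for a real trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1),
])

# Convert the model to the TensorFlow Lite FlatBuffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Write the compressed model to disk for deployment.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

`convert()` returns the serialized FlatBuffer as bytes, so it can also be shipped without touching the filesystem.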


Deploy

Take the compressed .tflite file and load it into a mobile or embedded device.
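Once loaded, the model runs through the TensorFlow Lite Interpreter. A minimal Python sketch is below; to keep it self-contained it converts a tiny placeholder model in-process, whereas on device you would load the shipped .tflite file (e.g. via `model_path=`):

```python
import numpy as np
import tensorflow as tf

# Self-contained stand-in: convert a tiny model in-process.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(8,))])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the FlatBuffer into the interpreter and allocate buffers.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed an input matching the model's expected shape and dtype.
sample = np.random.random_sample(input_details[0]["shape"]).astype(
    input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()

output = interpreter.get_tensor(output_details[0]["index"])
```

On Android and iOS the same set-input / invoke / get-output flow is exposed through the Java, Kotlin, Swift, and Objective-C interpreter APIs.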


Optimize

Quantize by converting 32-bit floats to more efficient 8-bit integers, or run on the GPU.
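Post-training quantization is enabled on the same converter. A minimal sketch, assuming a placeholder model with a (1, 8) input; the representative dataset gives the converter sample inputs to calibrate activation ranges:

```python
import numpy as np
import tensorflow as tf

# Stand-in for a real trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Request the default optimization, which quantizes weights to 8 bits.
converter.optimizations = [tf.lite.Optimize.DEFAULT]

def representative_dataset():
    # A handful of sample inputs lets the converter calibrate
    # activation ranges for integer quantization.
    for _ in range(10):
        yield [np.random.random((1, 8)).astype(np.float32)]

converter.representative_dataset = representative_dataset
quantized_model = converter.convert()
```

Without the representative dataset, only the weights are quantized; with it, activations can be quantized as well for further size and latency gains.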

Solutions to common problems

Explore optimized models to help with common mobile and edge use cases.

Image classification

Identify hundreds of objects, including people, activities, animals, plants, and places.

Object detection

Detect multiple objects with bounding boxes. Yes, dogs and cats too.

Smart reply

Generate reply suggestions for incoming conversational chat messages.

News & announcements

Check out our blog for additional updates, and subscribe to our monthly TensorFlow newsletter to get the latest announcements sent directly to your inbox.

Apr 8, 2020 
Quantization Aware Training with TensorFlow Model Optimization Toolkit

QAT enables you to train and deploy models with the performance and size benefits of quantization while retaining accuracy close to that of the original model.

Apr 2, 2020 
TensorFlow Lite Core ML delegate enables faster inference on iPhones and iPads

Announcing a new TensorFlow Lite delegate that uses Apple's Core ML API to run floating-point models faster with the Neural Engine.

Mar 13, 2020 
TensorFlow Lite: ML for mobile and IoT devices (TF Dev Summit '20)

Tune in for our exciting new announcements for TFLite, now deployed on billions of devices in production.