Deploy machine learning models on mobile and IoT devices

TensorFlow Lite is an open source deep learning framework for on-device inference.

See the guide

Guides explain the concepts and components of TensorFlow Lite.

See examples

Explore TensorFlow Lite Android and iOS apps.

See models

Easily deploy pre-trained models.

How it works

Pick a model

Pick a new model or retrain an existing one.

Convert

Convert a TensorFlow model into a compressed FlatBuffer with the TensorFlow Lite Converter.
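
For example, a minimal conversion sketch in Python, assuming a TensorFlow SavedModel exported at ./saved_model (a hypothetical path):

```python
import tensorflow as tf

# Assumes a SavedModel exported at ./saved_model (hypothetical path).
converter = tf.lite.TFLiteConverter.from_saved_model("./saved_model")
tflite_model = converter.convert()

# Write the serialized FlatBuffer to disk.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```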

Deploy

Take the compressed .tflite file and load it into a mobile or embedded device.
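
A minimal sketch of loading and running that file with the Python Interpreter API (the same flow applies on-device through the Android and iOS APIs); model.tflite is an assumed filename:

```python
import numpy as np
import tensorflow as tf

# Load the converted model and allocate its input/output tensors.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run inference on a dummy input with the model's expected shape and dtype.
dummy_input = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
```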

Optimize

Quantize by converting 32-bit floats to more efficient 8-bit integers, or run on the GPU.
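
A sketch of enabling default post-training quantization during conversion, assuming the same hypothetical SavedModel path as above:

```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("./saved_model")
# Default optimizations quantize weights from 32-bit floats to 8-bit integers.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_quantized_model = converter.convert()
```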

Solutions to common problems

Explore optimized models to help with common mobile and edge use cases.

Image classification

Identify hundreds of objects, including people, activities, animals, plants, and places.

Object detection

Detect multiple objects with bounding boxes. Yes, dogs and cats too.

Smart reply

Generate reply suggestions for incoming conversational chat messages.

News & announcements

See updates to help you with your work, and subscribe to our monthly TensorFlow newsletter to get the latest announcements sent directly to your inbox.

Dec 12, 2019 
Example on-device model personalization with TensorFlow Lite

Try out a new on-device transfer learning image classification example.

Aug 6, 2019 
Track human poses in real-time on Android

Build a human pose estimation app that detects the positions of key body parts, such as a person's elbows and knees.

Aug 5, 2019 
Introducing float16 quantization for the Model Optimization Toolkit

Post-training float16 quantization reduces TensorFlow Lite model sizes by up to 50% while sacrificing very little accuracy, and works well on GPUs!
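
A sketch of enabling float16 quantization during conversion, again assuming a hypothetical SavedModel at ./saved_model:

```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("./saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
# Restrict quantized types to float16: weights are stored as 16-bit floats,
# roughly halving model size with minimal accuracy loss.
converter.target_spec.supported_types = [tf.float16]
tflite_fp16_model = converter.convert()
```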