Deploy machine learning models on mobile and IoT devices
TensorFlow Lite is an open-source deep learning framework for on-device inference.
How it works
Convert
Convert a TensorFlow model into a compressed FlatBuffer with the TensorFlow Lite Converter.
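A minimal sketch of the conversion step using the public `tf.lite.TFLiteConverter` Python API; the tiny Keras model here is just a stand-in, since any SavedModel or Keras model converts the same way:

```python
import tensorflow as tf

# A minimal stand-in model; a real workflow would convert a trained model.
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(2)(inputs)
model = tf.keras.Model(inputs, outputs)

# Convert the model to the TensorFlow Lite FlatBuffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Serialize the FlatBuffer to disk; this .tflite file is what gets deployed.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

For a model exported with `tf.saved_model.save`, use `tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)` instead.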
Deploy
Load the compressed .tflite file onto a mobile or embedded device and run it with the TensorFlow Lite interpreter.
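A sketch of loading and running a converted model with `tf.lite.Interpreter`. For brevity the model is converted in-memory here; a deployed app would instead ship a pre-built .tflite file and pass `model_path=` (on-device Python deployments typically use the lighter `tflite_runtime.Interpreter` with the same API):

```python
import numpy as np
import tensorflow as tf

# Build and convert a tiny stand-in model in-memory.
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(2)(inputs)
converter = tf.lite.TFLiteConverter.from_keras_model(tf.keras.Model(inputs, outputs))
tflite_model = converter.convert()

# Load the FlatBuffer into the interpreter and allocate its tensors.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run inference on a single example.
sample = np.random.rand(1, 4).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
print(result.shape)  # (1, 2)
```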
Optimize
Quantize the model by converting 32-bit floats to more efficient 8-bit integers, or run it on a GPU delegate.
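The quantization step above can be sketched with post-training dynamic-range quantization, enabled via the converter's `optimizations` flag; the small stand-in model is an assumption for illustration:

```python
import tensorflow as tf

# Stand-in model with enough weights for quantization savings to show.
inputs = tf.keras.Input(shape=(128,))
outputs = tf.keras.layers.Dense(64)(inputs)
model = tf.keras.Model(inputs, outputs)

# Baseline float32 conversion.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
float_model = converter.convert()

# Post-training dynamic-range quantization: weights are stored as
# 8-bit integers, shrinking the model roughly 4x.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
quantized_model = converter.convert()

print(len(float_model), len(quantized_model))
```

Full integer quantization additionally requires a representative dataset so activations can be calibrated; dynamic-range quantization, as shown, needs no data.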
Solutions to common problems
Explore optimized models to help with common mobile and edge use cases.
Image classification: Identify hundreds of objects, including people, activities, animals, plants, and places.

Question answering: Use a state-of-the-art natural language model to answer questions based on the content of a given passage of text with BERT.