Using TensorFlow Lite with Python is great for embedded devices based on Linux, such as Raspberry Pi and Coral devices with Edge TPU, among many others.
This page shows how you can start running TensorFlow Lite models with Python in just a few minutes. All you need is a TensorFlow model converted to TensorFlow Lite. (If you don't have a model converted yet, you can experiment using the model provided with the example linked below.)
About the TensorFlow Lite runtime package
To quickly start executing TensorFlow Lite models with Python, you can install
just the TensorFlow Lite interpreter, instead of all TensorFlow packages. We
call this simplified Python package `tflite_runtime`.

The `tflite_runtime` package is a fraction the size of the full `tensorflow`
package and includes the bare minimum code required to run inferences with
TensorFlow Lite, primarily the `Interpreter` Python class. This small package
is ideal when all you want to do is execute `.tflite` models and avoid wasting
disk space with the large TensorFlow library.
Install TensorFlow Lite for Python
You can install the TensorFlow Lite runtime package on Linux with pip:

```sh
python3 -m pip install tflite-runtime
```
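A quick way to confirm the installation worked is to import the package. This is just a sanity check, not part of the official install steps:

```python
# This import succeeds only if tflite-runtime was installed correctly.
import tflite_runtime.interpreter as tflite

print(tflite.Interpreter)  # should print the Interpreter class
```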
The `tflite-runtime` Python wheels are pre-built and provided for these platforms:
- Linux armv7l (e.g. Raspberry Pi 2, 3, 4 and Zero 2 running Raspberry Pi OS 32-bit)
- Linux aarch64 (e.g. Raspberry Pi 3, 4 running Debian ARM64)
- Linux x86_64
If you want to run TensorFlow Lite models on other platforms, you should either use the full TensorFlow package, or build the `tflite-runtime` package from source.
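If a script needs to run both on machines with the full `tensorflow` package and on devices with only `tflite-runtime`, one common approach is to try the lightweight import first and fall back. This is a sketch, not part of the installation steps:

```python
# Prefer the lightweight tflite_runtime package when it is available,
# and fall back to the Interpreter bundled with full TensorFlow otherwise.
try:
    from tflite_runtime.interpreter import Interpreter
except ImportError:
    import tensorflow as tf
    Interpreter = tf.lite.Interpreter
```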
If you're using TensorFlow with the Coral Edge TPU, you should instead follow the appropriate Coral setup documentation.
Run an inference using tflite_runtime
Instead of importing `Interpreter` from the `tensorflow` module, you now need to import it from `tflite_runtime`.

For example, after you install the package above, copy and run the `label_image.py` file. It will (probably) fail because you don't have the `tensorflow` library installed. To fix it, edit this line of the file:
```python
import tensorflow as tf
```
So it instead reads:
```python
import tflite_runtime.interpreter as tflite
```
And then change this line:
```python
interpreter = tf.lite.Interpreter(model_path=args.model_file)
```
So it reads:
```python
interpreter = tflite.Interpreter(model_path=args.model_file)
```
Now run `label_image.py` again. That's it! You're now executing TensorFlow Lite models.
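For reference, here is a minimal, self-contained sketch of the same inference flow. The `model.tflite` path and the random input are placeholders for illustration; substitute your own converted model and real input data:

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load the TFLite model and allocate its tensors.
# "model.tflite" is a placeholder path; use your own converted model.
interpreter = tflite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a random input matching the model's expected shape and dtype.
input_data = np.random.random_sample(input_details[0]["shape"]).astype(
    input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)

# Run inference and read the result back out.
interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]["index"])
print(output_data)
```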
For more details about the `Interpreter` API, read Load and run a model in Python.
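Before writing preprocessing code, it can also help to print a model's tensor details to see what shapes and dtypes it expects. Again, `model.tflite` is a placeholder path:

```python
import tflite_runtime.interpreter as tflite

# "model.tflite" is a placeholder path for illustration.
interpreter = tflite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

for detail in interpreter.get_input_details():
    print("input :", detail["name"], detail["shape"], detail["dtype"])
for detail in interpreter.get_output_details():
    print("output:", detail["name"], detail["shape"], detail["dtype"])
```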
If you have a Raspberry Pi, check out a video series about how to run object detection on Raspberry Pi using TensorFlow Lite.
If you're using a Coral ML accelerator, check out the Coral examples on GitHub.
To convert other TensorFlow models to TensorFlow Lite, read about the TensorFlow Lite Converter.
If you want to build the `tflite_runtime` wheel, read Build TensorFlow Lite Python Wheel Package.