This page shows how you can start running TensorFlow Lite models with Python in just a few minutes. All you need is a TensorFlow model converted to TensorFlow Lite. (If you don't have a model converted yet, you can experiment using the model provided with the example linked below.)
About the TensorFlow Lite runtime package
To quickly start executing TensorFlow Lite models with Python, you can install
just the TensorFlow Lite interpreter, instead of all TensorFlow packages. We
call this simplified Python package tflite_runtime.
The tflite_runtime package is a fraction of the size of the full tensorflow
package and includes the bare minimum code required to run inferences with
TensorFlow Lite—primarily the Interpreter
Python class. This small package is ideal when all you want to do is execute
.tflite models and avoid wasting disk space with the large TensorFlow library.
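The practical difference between the two packages is only where the Interpreter class comes from. As a minimal sketch (the helper name get_interpreter_class is mine, not part of either package), you can prefer tflite_runtime and fall back to the full tensorflow package when it isn't installed:

```python
import importlib.util

def get_interpreter_class():
    """Prefer the small tflite_runtime package; fall back to the full
    tensorflow package if tflite_runtime is not installed."""
    if importlib.util.find_spec("tflite_runtime") is not None:
        from tflite_runtime.interpreter import Interpreter
        return Interpreter
    import tensorflow as tf  # much larger dependency
    return tf.lite.Interpreter
```

Code written this way runs unchanged on a workstation with full TensorFlow and on a device that only has the runtime package.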
Install TensorFlow Lite for Python
To install the TensorFlow Lite runtime package, run this command:
pip3 install --extra-index-url https://google-coral.github.io/py-repo/ tflite_runtime
If you're on a Raspberry Pi, this command might fail due to a known issue with
pip (#4011). So we suggest you
specify one of the available tflite_runtime wheel files
that matches your system. For example, if you're running Raspberry Pi OS 10
(which has Python 3.7), instead use this command:
pip3 install https://github.com/google-coral/pycoral/releases/download/release-frogfish/tflite_runtime-2.5.0-cp37-cp37m-linux_armv7l.whl
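To pick the right wheel, you need the Python version tag (the cp37 part of the filename above) and your machine architecture (the armv7l part). A small helper (wheel_tag_hint is a hypothetical name, not a pip or TensorFlow API) can print both for the system you're on:

```python
import platform
import sys

def wheel_tag_hint():
    """Return a string like 'cp37 / armv7l' describing the Python
    version tag and machine architecture this system needs in a
    wheel filename."""
    py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
    return f"{py_tag} / {platform.machine()}"

print(wheel_tag_hint())
```

Match both parts of the printed hint against the wheel filenames on the release page before installing.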
Run an inference using tflite_runtime
Instead of importing
Interpreter from the
tensorflow module, you now need to
import it from tflite_runtime.
For example, after you install the package above, copy and run the
label_image.py file. It will (probably) fail because you don't have the
tensorflow library installed. To fix it, edit this line of the file:
import tensorflow as tf
So it instead reads:
import tflite_runtime.interpreter as tflite
And then change this line:
interpreter = tf.lite.Interpreter(model_path=args.model_file)
So it reads:
interpreter = tflite.Interpreter(model_path=args.model_file)
Now run label_image.py again. That's it! You're now executing TensorFlow Lite models.
For more details about the
Interpreter API, read
Load and run a model in Python.
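Stripped of its image-loading details, the flow that label_image.py follows is the standard Interpreter sequence: construct, allocate_tensors, set_tensor, invoke, get_tensor. A minimal sketch of that sequence, where the Interpreter class, model path, and input array are all supplied by you:

```python
import numpy as np

def run_once(interpreter_class, model_path, input_array):
    """Run a single inference with a TensorFlow Lite Interpreter class
    (tflite.Interpreter or tf.lite.Interpreter) and return the first
    output tensor."""
    interpreter = interpreter_class(model_path=model_path)
    interpreter.allocate_tensors()  # required before set/get_tensor
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    # Cast the input to whatever dtype the model expects (e.g. uint8
    # for quantized models, float32 for float models).
    data = np.asarray(input_array).astype(input_details[0]["dtype"])
    interpreter.set_tensor(input_details[0]["index"], data)
    interpreter.invoke()
    return interpreter.get_tensor(output_details[0]["index"])
```

For example, `run_once(tflite.Interpreter, "model.tflite", image)` returns the model's first output as a NumPy array, ready for argmax or label lookup.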
If you have a Raspberry Pi, try the classify_picamera.py example to perform image classification with the Pi Camera and TensorFlow Lite.
If you're using a Coral ML accelerator, check out the Coral examples on GitHub.
To convert other TensorFlow models to TensorFlow Lite, read about the TensorFlow Lite Converter.