Python quickstart

Using TensorFlow Lite with Python is great for embedded devices based on Linux, such as Raspberry Pi and Coral devices with an Edge TPU, among many others.

This page shows how you can start running TensorFlow Lite models with Python in just a few minutes. All you need is a TensorFlow model converted to TensorFlow Lite. (If you don't have a model converted yet, you can experiment using the model provided with the example linked below.)

About the TensorFlow Lite runtime package

To quickly start executing TensorFlow Lite models with Python, you can install just the TensorFlow Lite interpreter, instead of all TensorFlow packages. We call this simplified Python package tflite_runtime.

The tflite_runtime package is a fraction of the size of the full tensorflow package and includes the bare minimum code required to run inferences with TensorFlow Lite: primarily the Interpreter Python class. This small package is ideal when all you want to do is execute .tflite models and avoid wasting disk space with the large TensorFlow library.
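For example, once tflite_runtime is installed, loading a model takes only a couple of lines (a minimal sketch; model.tflite is a placeholder for any converted model on disk):

import tflite_runtime.interpreter as tflite

# The Interpreter class here behaves the same as tf.lite.Interpreter
# in the full TensorFlow package.
interpreter = tflite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()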

Install TensorFlow Lite for Python

If you're running Debian Linux or a derivative of Debian (including Raspberry Pi OS), you should install from our Debian package repo. This requires that you add a new repo list and key to your system and then install as follows:

echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo apt-get update
sudo apt-get install python3-tflite-runtime

For all other systems, you can install with pip:

pip3 install --extra-index-url https://google-coral.github.io/py-repo/ tflite_runtime

If you'd like to manually install a Python wheel, you can select one from all tflite_runtime wheels.
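Whichever method you use, you can verify the installation by importing the Interpreter class (a quick sanity check that only confirms the package imports cleanly):

python3 -c "from tflite_runtime.interpreter import Interpreter; print('tflite_runtime OK')"

If the command prints the confirmation instead of an ImportError, the runtime is installed correctly.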

Run an inference using tflite_runtime

Instead of importing Interpreter from the tensorflow module, you now need to import it from tflite_runtime.

For example, after you install the package above, copy and run the label_image.py file. It will (probably) fail because you don't have the tensorflow library installed. To fix it, edit this line of the file:

import tensorflow as tf

So it instead reads:

import tflite_runtime.interpreter as tflite

And then change this line:

interpreter = tf.lite.Interpreter(model_path=args.model_file)

So it reads:

interpreter = tflite.Interpreter(model_path=args.model_file)

Now run label_image.py again. That's it! You're now executing TensorFlow Lite models.
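Putting the whole flow together, a complete inference with tflite_runtime looks roughly like this (a hedged sketch; model.tflite is a placeholder, and the random input stands in for real preprocessed data):

import numpy as np
import tflite_runtime.interpreter as tflite

# Load the model and allocate memory for its input/output tensors.
interpreter = tflite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

# Ask the model what shape and type of input it expects.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a random tensor matching the expected shape and dtype.
input_data = np.random.random_sample(
    input_details[0]['shape']).astype(input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], input_data)

# Run inference and read back the first output tensor.
interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]['index'])
print(output_data)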

Learn more

For more details about the Interpreter API, read Load and run a model in Python.

If you have a Raspberry Pi, try the classify_picamera.py example to perform image classification with the Pi Camera and TensorFlow Lite.

If you're using a Coral ML accelerator, check out the Coral examples on GitHub.

To convert other TensorFlow models to TensorFlow Lite, read about the TensorFlow Lite Converter.
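As a quick illustration of what that conversion looks like (a sketch that requires the full tensorflow package, not tflite_runtime; saved_model_dir is a placeholder for your own SavedModel directory):

import tensorflow as tf

# Convert a SavedModel to the TensorFlow Lite flatbuffer format.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
tflite_model = converter.convert()

# Write the converted model to disk for use with tflite_runtime.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)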

If you want to build the tflite_runtime wheel yourself, read Build TensorFlow Lite Python Wheel Package.