
Python quickstart

Using TensorFlow Lite with Python is great for embedded devices based on Linux, such as Raspberry Pi and Coral devices with Edge TPU, among many others.

This page shows how you can start running TensorFlow Lite models with Python in just a few minutes. All you need is a TensorFlow model converted to TensorFlow Lite. (If you don't have a model converted yet, you can experiment using the model provided with the example linked below.)

Install just the TensorFlow Lite interpreter

To quickly start executing TensorFlow Lite models with Python, you can install just the TensorFlow Lite interpreter, instead of all TensorFlow packages.

This interpreter-only package is a fraction of the size of the full TensorFlow package and includes the bare minimum code required to run inferences with TensorFlow Lite: it includes only the tf.lite.Interpreter Python class. This small package is ideal when all you want to do is execute .tflite models and avoid wasting disk space on the large TensorFlow library.

To install just the interpreter, download the appropriate Python wheel for your system from the following table, and then install it with the pip install command.

For example, if you're setting up a Raspberry Pi (using Raspbian Buster, which has Python 3.7), install the Python wheel as follows (after you download the appropriate .whl file from the table below):

pip3 install tflite_runtime-1.14.0-cp37-cp37m-linux_armv7l.whl
Python 3.5:
  ARM 32:  tflite_runtime-1.14.0-cp35-cp35m-linux_armv7l.whl
  ARM 64:  tflite_runtime-1.14.0-cp35-cp35m-linux_aarch64.whl
  x86-64:  tflite_runtime-1.14.0-cp35-cp35m-linux_x86_64.whl

Python 3.6:
  ARM 32:  N/A
  ARM 64:  N/A
  x86-64:  tflite_runtime-1.14.0-cp36-cp36m-linux_x86_64.whl

Python 3.7:
  ARM 32:  tflite_runtime-1.14.0-cp37-cp37m-linux_armv7l.whl
  ARM 64:  tflite_runtime-1.14.0-cp37-cp37m-linux_aarch64.whl
  x86-64:  tflite_runtime-1.14.0-cp37-cp37m-linux_x86_64.whl
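To pick the right wheel, you need your Python version (which selects the cp35/cp36/cp37 build) and your CPU architecture (armv7l for ARM 32, aarch64 for ARM 64, x86_64 for x86-64). A quick way to check both, assuming python3 is on your PATH:

```shell
# Print the Python tag and machine architecture so you can match them
# against the wheel filenames above (e.g. cp37 + armv7l).
python3 -c "import sys, platform; print('cp%d%d' % sys.version_info[:2], platform.machine())"
```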

Run an inference using tflite_runtime

To distinguish this interpreter-only package from the full TensorFlow package (allowing both to be installed, if you choose), the Python module provided in the above wheel is named tflite_runtime.

So instead of importing Interpreter from the tensorflow module, you need to import it from tflite_runtime.

For example, after you install the package above, copy and run the example file. It will probably fail, because the tensorflow library isn't installed. To fix it, edit this line of the file:

import tensorflow as tf

So it instead reads:

import tflite_runtime.interpreter as tflite

And then change this line:

interpreter = tf.lite.Interpreter(model_path=args.model_file)

So it reads:

interpreter = tflite.Interpreter(model_path=args.model_file)

Now run the file again. That's it! You're now executing TensorFlow Lite models.
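Putting the pieces together, a minimal inference script built on the tflite_runtime package might look like the following sketch. The model path "model.tflite" and the input shape are placeholders (assumptions, not files shipped with this guide), and the fallback to the full tensorflow package is optional:

```python
import numpy as np

# Prefer the interpreter-only package; fall back to the full TensorFlow
# package if tflite_runtime is not installed (assumption: one of the two
# is present; otherwise run_inference raises a clear error).
try:
    import tflite_runtime.interpreter as tflite
except ImportError:
    try:
        from tensorflow import lite as tflite
    except ImportError:
        tflite = None  # neither package is available


def run_inference(model_path, input_data):
    """Run one inference on a .tflite model and return the first output tensor."""
    if tflite is None:
        raise RuntimeError("Install tflite_runtime (or tensorflow) first.")
    interpreter = tflite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    # The input must match the model's expected shape and dtype.
    interpreter.set_tensor(input_details[0]['index'], input_data)
    interpreter.invoke()
    return interpreter.get_tensor(output_details[0]['index'])


if __name__ == "__main__":
    # Hypothetical 224x224 RGB input; adjust to your model's input shape.
    dummy_input = np.zeros((1, 224, 224, 3), dtype=np.float32)
    try:
        print(run_inference("model.tflite", dummy_input))
    except (RuntimeError, ValueError, OSError) as err:
        print("Inference not run:", err)
```

The same script works unchanged against the full TensorFlow package, which is convenient when you develop on a desktop machine and deploy to an embedded device.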

Learn more

If you have a Raspberry Pi, try the example to perform image classification with the Pi Camera and TensorFlow Lite.

For more details about the Interpreter API, read Load and run a model in Python.

To convert other TensorFlow models to TensorFlow Lite, read about the TensorFlow Lite Converter.