TensorFlow Lite for Microcontrollers is designed to run machine learning models on microcontrollers and other devices with only a few kilobytes of memory. The core runtime just fits in 16 KB on an Arm Cortex-M3 and can run many basic models. It doesn't require operating system support, any standard C or C++ libraries, or dynamic memory allocation.
## Why microcontrollers are important
Microcontrollers are typically small, low-power computing devices embedded within hardware that requires basic computation. By bringing machine learning to tiny microcontrollers, we can boost the intelligence of the billions of devices that we use in our lives, including household appliances and Internet of Things devices, without relying on expensive hardware or reliable internet connections, which are often subject to bandwidth and power constraints and result in high latency. Running inference on-device can also help preserve privacy, since no data leaves the device. Imagine smart appliances that can adapt to your daily routine, intelligent industrial sensors that understand the difference between problems and normal operation, and magical toys that can help kids learn in fun and delightful ways.
TensorFlow Lite for Microcontrollers is written in C++ 17 and requires a 32-bit platform. It has been tested extensively with many processors based on the Arm Cortex-M Series architecture, and has been ported to other architectures including ESP32. The framework is available as an Arduino library. It can also generate projects for development environments such as Mbed. It is open source and can be included in any C++ 17 project.
The following development boards are supported:
- Arduino Nano 33 BLE Sense
- SparkFun Edge
- STM32F746 Discovery kit
- Adafruit EdgeBadge
- Adafruit TensorFlow Lite for Microcontrollers Kit
- Adafruit Circuit Playground Bluefruit
- Espressif ESP32-DevKitC
- Espressif ESP-EYE
- Wio Terminal: ATSAMD51
- Himax WE-I Plus EVB Endpoint AI Development Board
- Synopsys DesignWare ARC EM Software Development Platform
- Sony Spresense
## Explore the examples
Each example application is on GitHub and has a README.md file that explains how it can be deployed to its supported platforms. Some examples also have end-to-end tutorials using a specific platform, as given below:
- Hello World - Demonstrates the absolute basics of using TensorFlow Lite for Microcontrollers
- Micro speech - Captures audio with a microphone to detect the words "yes" and "no"
- Person detection - Captures camera data with an image sensor to detect the presence or absence of a person
The following steps are required to deploy and run a TensorFlow model on a microcontroller:
1. Train a model:
   - Generate a small TensorFlow model that can fit your target device and contains supported operations.
   - Convert to a TensorFlow Lite model using the TensorFlow Lite converter.
2. Convert to a C byte array using standard tools so that it can be stored in read-only program memory on device.
3. Run inference on device using the C++ library and process the results.
TensorFlow Lite for Microcontrollers is designed for the specific constraints of microcontroller development. If you are working on more powerful devices (for example, an embedded Linux device like the Raspberry Pi), the standard TensorFlow Lite framework might be easier to integrate.
The following limitations should be considered:
- Support for a limited subset of TensorFlow operations
- Support for a limited set of devices
- Low-level C++ API requiring manual memory management
- No support for on-device training
To continue, explore the following resources:
- Get started with microcontrollers to try the example application and learn how to use the API.
- Understand the C++ library to learn how to use the library in your own project.
- Build and convert models to learn more about training and converting models for deployment on microcontrollers.