Deploy machine learning models on mobile and IoT devices
TensorFlow Lite is an open-source deep learning framework for on-device inference.
How it works
Pick a model
Pick a new model or retrain an existing one.
Convert
Convert a TensorFlow model into a compressed FlatBuffer with the TensorFlow Lite Converter.
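As a rough sketch of this step in Python (the tiny Keras model below is a placeholder standing in for a real trained model):

```python
import tensorflow as tf

# Placeholder model standing in for a real trained model.
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(2, activation="softmax")(inputs)
model = tf.keras.Model(inputs, outputs)

# Convert the model into a TensorFlow Lite FlatBuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Write the .tflite file that will be shipped to the device.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

A SavedModel on disk can be converted the same way with `tf.lite.TFLiteConverter.from_saved_model`.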
Deploy
Take the compressed .tflite file and load it into a mobile or embedded device.
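On the device, the TensorFlow Lite interpreter runs the converted model. A minimal Python sketch, converting a placeholder model in memory rather than loading a file from disk:

```python
import numpy as np
import tensorflow as tf

# Placeholder model converted in memory; on a device you would instead
# load the shipped file with tf.lite.Interpreter(model_path="model.tflite").
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(2)(inputs)
model = tf.keras.Model(inputs, outputs)
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the FlatBuffer into the interpreter and allocate tensor buffers.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run inference on one input tensor.
x = np.random.rand(1, 4).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(output_details[0]["index"])
```

On Android or iOS the equivalent Interpreter APIs are available in Java/Kotlin and Swift/Objective-C.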
Optimize
Quantize by converting 32-bit floats to more efficient 8-bit integers or run on GPU.
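A sketch of the simplest quantization path, post-training dynamic-range quantization, where `tf.lite.Optimize.DEFAULT` tells the converter to store weights as 8-bit integers (the model is again a small placeholder):

```python
import tensorflow as tf

# Placeholder model; a real workflow would quantize a trained model.
inputs = tf.keras.Input(shape=(4,))
hidden = tf.keras.layers.Dense(8, activation="relu")(inputs)
outputs = tf.keras.layers.Dense(2)(hidden)
model = tf.keras.Model(inputs, outputs)

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Optimize.DEFAULT enables dynamic-range quantization: weights are
# stored as 8-bit integers and dequantized at runtime.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
quantized_model = converter.convert()
```

GPU execution, by contrast, is configured on the device at inference time through a GPU delegate rather than at conversion time.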
Identify hundreds of objects, including people, activities, animals, plants, and places.
Detect multiple objects with bounding boxes. Yes, dogs and cats too.
Use BERT, a state-of-the-art natural language model, to answer questions based on the content of a given passage of text.
MoveNet is a state-of-the-art pose estimation model for mobile devices that can run in real time on modern smartphones. Learn about recent updates and how to do custom pose classification on Android, iOS, and Raspberry Pi.
To bridge the gap between mobile and web ML development, you can easily deploy the TensorFlow Lite Task Library to the web with the power of WebAssembly.
The TensorFlow ecosystem enables companies like Edge Impulse to put artificial intelligence in the hands of domain experts who are building the next generation of consumer and industrial technologies.
Learn how to train a custom object detection model and deploy it to an Android app with just a few lines of code. All you need are Android Studio and a web browser. No machine learning knowledge is required.