Deploy machine learning models on mobile and edge devices
TensorFlow Lite is a library for deploying machine learning models on mobile phones, microcontrollers, and other edge devices.
How it works
Pick a model
Pick a new model or retrain an existing one.
Convert a TensorFlow model into a compressed FlatBuffer with the TensorFlow Lite Converter.
Take the compressed .tflite file and load it into a mobile or embedded device.
Quantize the model by converting 32-bit floats to more efficient 8-bit integers, or run it on the GPU.
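The steps above can be sketched end to end in Python. This is a minimal example, not a production recipe: the tiny tf.function standing in for a trained model is hypothetical, and any SavedModel or Keras model can be converted the same way.

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in for a trained model: a tf.function computing y = 2x + 1.
@tf.function(input_signature=[tf.TensorSpec(shape=[1, 4], dtype=tf.float32)])
def model(x):
    return 2.0 * x + 1.0

# Convert the model into the compressed .tflite FlatBuffer format.
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [model.get_concrete_function()]
)
# Default optimizations quantize 32-bit float weights to 8-bit integers.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()  # bytes you could write out as model.tflite

# Load the converted model and run inference with the TFLite interpreter,
# just as you would on a mobile or embedded device.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

x = np.array([[1.0, 2.0, 3.0, 4.0]], dtype=np.float32)
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(output_details[0]["index"])
print(y)  # approximately [[3. 5. 7. 9.]]
```

On a device you would load the saved .tflite file instead (via `model_path=` in Python, or the Android, iOS, or microcontroller interpreter APIs), but the convert-then-invoke flow is the same.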
Identify hundreds of objects, including people, activities, animals, plants, and places.
Detect multiple objects with bounding boxes. Yes, dogs and cats too.
Use BERT, a state-of-the-art natural language model, to answer questions based on the content of a given passage of text.