iOS quickstart

To get started with TensorFlow Lite on iOS, we recommend exploring the following example.

iOS image classification example

For an explanation of the source code, you should also read TensorFlow Lite iOS image classification.

This example app uses image classification to continuously classify whatever it sees from the device's rear-facing camera. The application must be run on an iOS device.

Inference is performed using the TensorFlow Lite C++ API. The demo app classifies frames in real time, displaying the most probable classifications. It allows the user to choose between a floating point or quantized model, select the thread count, and decide whether to run inference on the CPU or the GPU.

Build in Xcode

To build the example in Xcode, follow the instructions in README.md.

Create your own iOS app

To get started quickly writing your own iOS code, we recommend using our iOS image classification example as a starting point.

The following sections contain some useful information for working with TensorFlow Lite on iOS.

Use TensorFlow Lite from Objective-C and Swift

The example app provides an Objective-C wrapper on top of the C++ TensorFlow Lite library. This wrapper is required because there is currently no direct interoperability between Swift and C++. The wrapper is exposed to Swift via bridging so that the TensorFlow Lite methods can be called from Swift.
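In practice, the bridging is done with a bridging header that imports the wrapper's Objective-C header. A minimal sketch, assuming the wrapper's header is named TfliteWrapper.h and your project's bridging header follows Xcode's default naming (adjust both to match your project):

```objectivec
// YourApp-Bridging-Header.h
//
// Xcode creates (or offers to create) this file when you add an
// Objective-C file to a Swift target. Every Objective-C header
// imported here becomes visible to all Swift code in the target.
#import "TfliteWrapper.h"
```

After this import, the TfliteWrapper class and its methods can be used from Swift with no further configuration.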

The wrapper is located in TensorflowLiteWrapper. It is not tightly coupled with the example code, so you can use it in your own iOS apps. It exposes the following interface:

@interface TfliteWrapper : NSObject

/**
 This method initializes the TfliteWrapper with the specified model file.
 */
- (instancetype)initWithModelFileName:(NSString *)fileName;

/**
 This method initializes the TensorFlow Lite interpreter with the specified
 model file, preparing it to perform inference.
 */
- (BOOL)setUpModelAndInterpreter;

/**
 This method gets a reference to the input tensor at an index.
 */
- (uint8_t *)inputTensorAtIndex:(int)index;

/**
 This method performs the inference by invoking the interpreter.
 */
- (BOOL)invokeInterpreter;

/**
 This method gets the output tensor at a specified index.
 */
- (uint8_t *)outputTensorAtIndex:(int)index;

/**
 This method sets the number of threads used by the interpreter to perform inference.
 */
- (void)setNumberOfThreads:(int)threadCount;

@end

To use these files in your own iOS app, copy them into your Xcode project.
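As an illustration, the wrapper's methods could be driven from Swift roughly as follows. This is a sketch, not code from the example app: the model file name is a placeholder, the Swift method names depend on how Swift's Objective-C importer renames the selectors in your toolchain, and preparing the input bytes from a camera frame is app-specific and elided here.

```swift
// Load the model (placeholder name) and set up the interpreter.
let wrapper = TfliteWrapper(modelFileName: "your_model")
wrapper.setNumberOfThreads(2)
guard wrapper.setUpModelAndInterpreter() else {
    fatalError("Failed to set up the TensorFlow Lite interpreter")
}

// Copy preprocessed image bytes (app-specific; `inputData` is assumed
// to already hold the RGB bytes your model expects) into input tensor 0.
let input: UnsafeMutablePointer<UInt8> = wrapper.inputTensor(atIndex: 0)
inputData.copyBytes(to: input, count: inputData.count)

// Run inference, then read the results from output tensor 0.
if wrapper.invokeInterpreter() {
    let output: UnsafeMutablePointer<UInt8> = wrapper.outputTensor(atIndex: 0)
    // Interpret `output` according to your model's output shape,
    // e.g. one byte per class label for a quantized classifier.
}
```

Because the tensor accessors return raw uint8_t pointers, this interface is suited to quantized models; a floating point model would require a wrapper variant that exposes float tensors instead.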