TensorFlow Lite in Google Play services (BETA)

TensorFlow Lite is available in the Google Play services API as a public beta on all Android devices running the current version of Play services, starting February 17, 2022. This API lets you run machine learning models without statically bundling TensorFlow Lite libraries into your app, allowing you to:

  • Reduce your app size
  • Gain improved performance from the latest stable version of TensorFlow Lite

This page provides a brief overview of how to use the new TensorFlow Lite in Google Play services APIs in your Android app.

For more information about Google Play services, see the Google Play services website.

Add TensorFlow Lite to your app

You can use the TensorFlow Lite in Google Play services API by making a few changes to your app module dependencies, initializing the new API, and using a specific class as your interpreter object. The following instructions provide more details on how to modify your app code.

1. Add project dependencies

Add the following dependencies to your app project code to access the Play services API for TensorFlow Lite:

dependencies {
...
    // TensorFlow Lite dependencies for Google Play services
    implementation 'com.google.android.gms:play-services-tflite-java:16.0.0-beta02'
    // Optional: include the TensorFlow Lite Support Library
    implementation 'com.google.android.gms:play-services-tflite-support:16.0.0-beta02'
...
}

2. Add initialization of TensorFlow Lite

Initialize the TensorFlow Lite component of the Google Play services API before using the TensorFlow Lite APIs:

Kotlin

val initializeTask: Task<Void> by lazy { TfLite.initialize(this) }

Java

Task<Void> initializeTask = TfLite.initialize(context);

3. Create an Interpreter and set the runtime option

Create an interpreter using InterpreterApi.create() and configure it to use the Google Play services runtime by calling InterpreterApi.Options.setRuntime(), as shown in the following example code:

Kotlin

import org.tensorflow.lite.InterpreterApi
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime
...
private lateinit var interpreter: InterpreterApi
...
initializeTask.addOnSuccessListener {
    val interpreterOption =
      InterpreterApi.Options().setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY)
    interpreter = InterpreterApi.create(
      modelBuffer,
      interpreterOption
    )
  }
  .addOnFailureListener { e ->
    Log.e("Interpreter", "Cannot initialize interpreter", e)
  }

Java

import org.tensorflow.lite.InterpreterApi;
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime;
...
private InterpreterApi interpreter;
...
initializeTask.addOnSuccessListener(a -> {
    interpreter = InterpreterApi.create(modelBuffer,
      new InterpreterApi.Options().setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY));
  })
  .addOnFailureListener(e -> {
    Log.e("Interpreter", String.format("Cannot initialize interpreter: %s",
          e.getMessage()));
  });

You should use the implementation above because it avoids blocking the Android user interface thread. If you need to manage thread execution more closely, you can add a Tasks.await() call to interpreter creation, as long as you do not call it on the user interface thread:

Kotlin

import androidx.lifecycle.lifecycleScope
import kotlinx.coroutines.tasks.await // Task.await() extension from kotlinx-coroutines-play-services
...
lifecycleScope.launchWhenStarted { // uses coroutine
  initializeTask.await()
}

Java

@WorkerThread
InterpreterApi initializeInterpreter() throws ExecutionException, InterruptedException {
    // Tasks.await() must not be called on the main thread.
    Tasks.await(initializeTask);
    return InterpreterApi.create(...);
}

4. Run inferences

Using the interpreter object you created, call the run() method to generate an inference.

Kotlin

interpreter.run(inputBuffer, outputBuffer)

Java

interpreter.run(inputBuffer, outputBuffer);
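
The shapes and types of inputBuffer and outputBuffer depend on your model. As a minimal sketch, assume a hypothetical model with a [1, 4] float32 input tensor and a [1, 2] float32 output tensor (these shapes are illustrative, not part of the API):

Kotlin

import java.nio.ByteBuffer
import java.nio.ByteOrder
...
// Hypothetical shapes: [1, 4] float32 input, [1, 2] float32 output.
val inputBuffer = ByteBuffer.allocateDirect(4 * 4).order(ByteOrder.nativeOrder())
floatArrayOf(0.1f, 0.2f, 0.3f, 0.4f).forEach { inputBuffer.putFloat(it) }
inputBuffer.rewind()

val outputBuffer = ByteBuffer.allocateDirect(2 * 4).order(ByteOrder.nativeOrder())

interpreter.run(inputBuffer, outputBuffer)

// Read the results back out of the output buffer.
outputBuffer.rewind()
val scores = FloatArray(2) { outputBuffer.float }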

TensorFlow Lite in Google Play services

The TensorFlow Lite in Google Play services API lets you access the actual TensorFlow Lite Java API in your app after you initialize it using a new method in Play services. This approach keeps code changes for existing apps to a minimum and simplifies new implementations. For more information about the API for this feature, see the TensorFlow Lite API reference.

Migrating from Standalone TensorFlow Lite

If you are planning to migrate your app from standalone TensorFlow Lite to the Play services API, review the following additional guidance for updating your app project code:

  1. Review the Limitations section of this page to ensure your use case is supported.
  2. Prior to updating your code, do performance and accuracy checks for your models, particularly if you are using versions of TensorFlow Lite earlier than version 2.1, so you have a baseline to compare against the new implementation.
  3. If you have migrated all of your code to use the Play services API for TensorFlow Lite, remove the existing TensorFlow Lite runtime library dependencies (entries with org.tensorflow:tensorflow-lite:*) from your build.gradle file to reduce your app size.
  4. Identify all occurrences of new Interpreter object creation in your code, and modify each one to use the InterpreterApi.create() call. This new API is asynchronous, which means in most cases it is not a drop-in replacement, and you must register a listener for when the call completes. Refer to the code snippet in Step 3 and the migration sketch after this list.
  5. Add import org.tensorflow.lite.InterpreterApi; and import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime; to any source files using the org.tensorflow.lite.Interpreter or org.tensorflow.lite.InterpreterApi classes.
  6. If any of the resulting calls to InterpreterApi.create() have only a single argument, append new InterpreterApi.Options() to the argument list.
  7. Append .setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY) to the last argument of any calls to InterpreterApi.create().
  8. Replace all other occurrences of the org.tensorflow.lite.Interpreter class with org.tensorflow.lite.InterpreterApi.
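
The following before-and-after sketch illustrates steps 4 through 8 in Kotlin; modelBuffer stands in for however your app loads its model, so treat the surrounding names as assumptions:

Kotlin

// Before: standalone TensorFlow Lite creates the interpreter synchronously.
// import org.tensorflow.lite.Interpreter
// val interpreter = Interpreter(modelBuffer)

// After: the Play services API creates the interpreter asynchronously,
// once initialization succeeds.
import org.tensorflow.lite.InterpreterApi
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime
...
TfLite.initialize(context).addOnSuccessListener {
  val interpreter = InterpreterApi.create(
    modelBuffer,
    InterpreterApi.Options().setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY)
  )
  // Use the interpreter here; because creation is asynchronous, it is
  // not a drop-in replacement for the standalone constructor.
}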

If you want to use standalone TensorFlow Lite and the Play services API side-by-side, you must use TensorFlow Lite 2.9 (or later). TensorFlow Lite 2.8 and earlier versions are not compatible with the Play services API version.
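
For example, a build.gradle sketch of side-by-side dependencies (the versions shown are assumptions; use the versions appropriate for your project):

dependencies {
    // Standalone TensorFlow Lite runtime; must be 2.9 or later for side-by-side use
    implementation 'org.tensorflow:tensorflow-lite:2.9.0'
    // TensorFlow Lite in Google Play services
    implementation 'com.google.android.gms:play-services-tflite-java:16.0.0-beta02'
}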

Example App

You can review and test an example implementation of TensorFlow Lite in Google Play services in the example app.

Testing

After implementing TensorFlow Lite in Google Play services, make sure to test your application and exercise the machine learning model functions of your app. If you experience errors or issues you are unable to resolve, please report them by using the channels outlined in the Support and feedback section below.

LoadingException: No acceptable module

While testing your app through a development environment during the Beta launch period, you may get an exception when your app attempts to initialize the TensorFlow Lite class (TfLite.initialize(context)):

com.google.android.gms.dynamite.DynamiteModule$LoadingException:
  No acceptable module com.google.android.gms.tflite_dynamite found.
  Local version is 0 and remote version is 0.

This error means that the TensorFlow Lite in Google Play services API is not yet available on your test device. You can resolve this exception by joining the tflite-play-services-beta-access Google group with the user account you are using to test on your device. Once you have been added to the beta access group, this exception should be resolved.

Allow at least one business day after you join this group for access to be granted and the error to clear. If you continue to experience this error, report it using the channels outlined in the Support and feedback section below.

Limitations

TensorFlow Lite in Google Play services is currently at public beta and has the following limitations:

  • Only the NNAPI delegate is currently supported by Google Play services; a usage sketch follows this list. Other TensorFlow Lite delegates, including the GPU and Flex delegates, are not currently supported.
  • Access to TensorFlow Lite via native APIs is not supported. Only the TensorFlow Lite Java APIs are available through Google Play services.
  • Experimental or deprecated TensorFlow Lite APIs, including custom ops, are not supported.
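
As a minimal sketch of opting in to the NNAPI delegate, assuming InterpreterApi.Options().setUseNNAPI() is available in your TensorFlow Lite version as it is in the standalone Java API:

Kotlin

// Assumes setUseNNAPI() is exposed by your TensorFlow Lite version.
val options = InterpreterApi.Options()
    .setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY)
    .setUseNNAPI(true) // opt in to the NNAPI delegate
val interpreter = InterpreterApi.create(modelBuffer, options)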

Support and feedback

You can provide feedback and get support for this beta release through the TensorFlow Issue Tracker. Please report issues and support requests using the Issue template for TensorFlow Lite in Google Play services.

Terms and Privacy Policy

Use of TensorFlow Lite in Google Play services is subject to the Google APIs Terms of Service. Note that TensorFlow Lite in Google Play services is in beta and, as such, its functionality as well as associated APIs may change without advance notice.

When you use the TensorFlow Lite in Google Play services APIs, processing of input data such as images, video, and text happens fully on-device, and TensorFlow Lite in Google Play services does not send that data to Google servers. As a result, you can use our APIs for processing data that should not leave the device.

The TensorFlow Lite in Google Play services APIs may contact Google servers from time to time to receive things like bug fixes, updated models, and hardware accelerator compatibility information. The TensorFlow Lite in Google Play services APIs also send metrics about the performance and utilization of the APIs in your app to Google. Google uses this metrics data to measure performance, debug, maintain and improve the APIs, and to detect misuse or abuse, as further described in our Privacy Policy.

You are responsible for informing users of your app about Google's processing of TensorFlow Lite in Google Play services metrics data as required by applicable law.

Data we collect includes the following:

  • Device information (such as manufacturer, model, OS version and build) and available ML hardware accelerators (GPU and DSP). Used for diagnostics and usage analytics.
  • Device identifier. Used for diagnostics and usage analytics.
  • App information (package name, app version). Used for diagnostics and usage analytics.
  • API configuration (such as which delegates are being used). Used for diagnostics and usage analytics.
  • Event type (such as interpreter creation, inference). Used for diagnostics and usage analytics.
  • Error codes. Used for diagnostics.
  • Performance metrics. Used for diagnostics.

Next steps

For more information about implementing machine learning in your mobile application with TensorFlow Lite, see the TensorFlow Lite Developer Guide. You can find additional TensorFlow Lite models for image classification, object detection, and other applications on the TensorFlow Hub.