Smart reply

Get started

Our smart reply model generates reply suggestions based on chat messages. The suggestions are intended to be contextually relevant, one-touch responses that help the user reply easily to an incoming message.

Download starter model

Sample application

There is a TensorFlow Lite sample application that demonstrates the smart reply model on Android.

View Android example

Read the GitHub page to learn how the app works. The project also shows how to build an app with custom C++ ops.
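
As a rough illustration of how an app consumes the model, the Kotlin sketch below defines a hypothetical JNI-backed client. The names `SmartReplyClient`, `loadModel`, and `predict` are illustrative stand-ins rather than the sample's exact API; in the real project, loading the model and registering the custom C++ ops happens in native code behind a JNI wrapper.

```kotlin
// Hypothetical interface to a native smart reply engine. In the actual sample,
// a JNI wrapper loads the .tflite model and registers the custom C++ ops.
data class SmartReply(val text: String, val score: Float)

interface SmartReplyClient : AutoCloseable {
    /** Loads the bundled on-device model; no network access is involved. */
    fun loadModel()

    /** Returns ranked one-touch reply suggestions for the latest messages. */
    fun predict(recentMessages: Array<String>): Array<SmartReply>
}

// Example usage: feed the incoming message and surface the top suggestions.
fun suggestReplies(client: SmartReplyClient, incoming: String): List<String> =
    client.predict(arrayOf(incoming))
        .sortedByDescending { it.score }
        .map { it.text }
```

Keeping the inference logic behind a small interface like this lets the UI layer stay independent of the native implementation details.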

How it works

The model generates reply suggestions for conversational chat messages.

The on-device model comes with several benefits. It is:

  • Fast: The model resides on the device and does not require internet connectivity, so inference is very fast, with an average latency of only a few milliseconds. (A sketch of loading the bundled model follows this list.)
  • Resource efficient: The model has a small memory footprint on the device.
  • Privacy-friendly: User data never leaves the device.
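
Because the model ships inside the app, loading it is purely a local file operation. The sketch below shows one common way to memory-map a bundled `.tflite` file from the APK's assets using standard Android APIs; the asset name `smartreply.tflite` is an assumption for illustration, and the asset must be stored uncompressed for `openFd` to succeed.

```kotlin
import android.content.Context
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Memory-maps a model bundled in the APK's assets so the weights stay out of
// the Java heap. The asset name is illustrative; the sample ships its own file.
fun loadModelFile(context: Context, assetName: String = "smartreply.tflite"): MappedByteBuffer =
    context.assets.openFd(assetName).use { fd ->
        FileInputStream(fd.fileDescriptor).use { input ->
            input.channel.map(
                FileChannel.MapMode.READ_ONLY,
                fd.startOffset,
                fd.declaredLength
            )
        }
    }
```

Memory-mapping avoids copying the model into application memory, which keeps the footprint small and lets multiple processes share the same read-only pages.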

Example output

[Animation showing smart reply suggestions]

Read more about this

Users