Additional API references

This section collects API reference pages for projects and packages that are separate from the tensorflow package but do not have dedicated subsite pages.

The TensorFlow Models repository

The TensorFlow Models repository provides implementations of state-of-the-art (SOTA) models.

The official/projects directory contains a collection of SOTA models that use TensorFlow’s high-level API. They are intended to be well-maintained, tested, and kept up-to-date with the latest TensorFlow API.

The library code used to build and train these models is available as a pip package. You can install it using:

$ pip install tf-models-official   # For the latest release
$ # or
$ pip install tf-models-nightly    # For the nightly build

To install the package from source, refer to these instructions.

The tf-models-official pip package contains two top-level modules: tensorflow_models and orbit. You can import them with:

import tensorflow_models as tfm
import orbit

TensorFlow Models

API reference.

The tensorflow_models module handles building models and configuring training. Application-specific functionality is available under tfm.vision and tfm.nlp.


Orbit

API reference.

The orbit module defines a flexible and lightweight library for writing customized training loop code in TensorFlow. Orbit is flexible about the type of models it works with: you can use it to train Keras Models (as an alternative to Keras' Model.fit), but you don't have to use Keras at all. Orbit integrates seamlessly with tf.distribute and supports running on different device types (CPU, GPU, and TPU).

TensorFlow Compression

API reference.

The TensorFlow Compression repository implements learnable compression algorithms that can be used to efficiently compress your data or models.

On Linux and macOS, the package can be installed with pip:

pip install tensorflow_compression

To install from source, refer to these instructions.