
Model compatibility for TF1/TF2

TF Hub model formats

TF Hub offers reusable model pieces that can be loaded, built upon, and possibly retrained in a TensorFlow program. These come in two different formats: the legacy TF1 Hub format and the TF2 SavedModel format.

The model format is listed on each model's page on tfhub.dev. Depending on the format, model loading/inference, fine-tuning, or creation may not be supported in TF1 or TF2, as detailed below.

Compatibility of the TF1 Hub format

Operation: Loading / Inference

TF1 / TF1 compat mode in TF2 [1]: Fully supported (complete TF1 Hub format loading guide).

    m = hub.Module(handle)
    outputs = m(inputs)

TF2: It's recommended to use either hub.load

    m = hub.load(handle)
    outputs = m.signatures["sig"](inputs)

or hub.KerasLayer

    m = hub.KerasLayer(handle, signature="sig")
    outputs = m(inputs)

Operation: Fine-tuning

TF1 / TF1 compat mode in TF2 [1]: Fully supported (complete TF1 Hub format fine-tuning guide).

    m = hub.Module(handle,
                   trainable=True,
                   tags=["train"]*is_training)
    outputs = m(inputs)

Note: modules that don't need a separate train graph don't have a train tag.

TF2: Not supported.

Operation: Creation

TF1 / TF1 compat mode in TF2 [1]: Fully supported (see complete TF1 Hub format creation guide).
Note: the TF1 Hub format is geared towards TF1 and is only partially supported in TF2. Consider creating a TF2 SavedModel instead.

TF2: Not supported.
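The signatures pattern shown above for loading in TF2 applies to any SavedModel with named signatures, whether addressed by a tfhub.dev handle or a local path. A minimal self-contained sketch, using tf.saved_model.save/load on a temporary directory in place of a remote handle (for a local path, hub.load behaves the same as tf.saved_model.load, so only core TensorFlow is needed here):

```python
import tempfile
import tensorflow as tf

# Build a tiny model and export it with a named serving signature.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        # Returning a dict fixes the output names of the signature.
        return {"doubled": 2.0 * x}

export_dir = tempfile.mkdtemp()
m = Doubler()
tf.saved_model.save(m, export_dir,
                    signatures={"serving_default": m.__call__})

# Loading by path mirrors m = hub.load(handle) followed by
# outputs = m.signatures["sig"](inputs).
loaded = tf.saved_model.load(export_dir)
outputs = loaded.signatures["serving_default"](x=tf.constant([1.0, 2.0]))
print(outputs["doubled"].numpy())  # -> [2. 4.]
```

Signature functions take keyword arguments named after the exported function's parameters and return a dict of named output tensors.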

Compatibility of TF2 SavedModel

TF2 SavedModels are not supported in TensorFlow versions before TF1.15.

Operation: Loading / Inference

TF1.15 / TF1 compat mode in TF2 [1]: Use either hub.load

    m = hub.load(handle)
    outputs = m(inputs)

or hub.KerasLayer

    m = hub.KerasLayer(handle)
    outputs = m(inputs)

TF2: Fully supported (complete TF2 SavedModel loading guide). Use either hub.load

    m = hub.load(handle)
    outputs = m(inputs)

or hub.KerasLayer

    m = hub.KerasLayer(handle)
    outputs = m(inputs)

Operation: Fine-tuning

TF1.15 / TF1 compat mode in TF2 [1]: Supported for a hub.KerasLayer used in a tf.keras.Model when trained with Model.fit(), or trained in an Estimator whose model_fn wraps the Model per the custom model_fn guide.
Note: hub.KerasLayer does not fill in graph collections like the old tf.compat.v1.layers or hub.Module APIs did.

TF2: Fully supported (complete TF2 SavedModel fine-tuning guide). Use either hub.load:

    m = hub.load(handle)
    outputs = m(inputs, training=is_training)

or hub.KerasLayer:

    m = hub.KerasLayer(handle, trainable=True)
    outputs = m(inputs)

Operation: Creation

TF1.15 / TF1 compat mode in TF2 [1]: The TF2 API tf.saved_model.save() can be called from within compat mode.

TF2: Fully supported (see complete TF2 SavedModel creation guide).
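The TF2 fine-tuning path above wires a trainable layer into a tf.keras.Model and lets Model.fit() update its weights. A runnable sketch of that wiring, where an ordinary Dense layer stands in for hub.KerasLayer(handle, trainable=True) so no model download is needed (the surrounding Model/fit code is identical either way):

```python
import numpy as np
import tensorflow as tf

# The "encoder" below stands in for hub.KerasLayer(handle, trainable=True);
# swapping in a real hub.KerasLayer would not change the rest of this code.
inputs = tf.keras.Input(shape=(4,))
encoder = tf.keras.layers.Dense(8, activation="relu")  # trainable stand-in
outputs = tf.keras.layers.Dense(1)(encoder(inputs))
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")

# Tiny synthetic regression problem.
x = np.random.rand(32, 4).astype("float32")
y = x.sum(axis=1, keepdims=True)

# Because the encoder layer is trainable, Model.fit() updates its weights
# along with the rest of the model.
before = [w.numpy().copy() for w in encoder.trainable_weights]
model.fit(x, y, epochs=1, verbose=0)
after = [w.numpy() for w in encoder.trainable_weights]
changed = any((b != a).any() for b, a in zip(before, after))
print(changed)  # -> True
```

Setting trainable=False on the stand-in (the default for hub.KerasLayer) would freeze those weights during fit(), which is the usual starting point before unfreezing for fine-tuning.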

[1] "TF1 compat mode in TF2" refers to the combined effect of importing TF2 with import tensorflow.compat.v1 as tf and running tf.disable_v2_behavior(), as described in the TensorFlow Migration guide.
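A minimal sketch of what compat mode looks like in practice: TF2 is installed, but after the import and disable_v2_behavior() call, the program uses TF1's graph-and-session semantics (placeholders, feed dicts, Session.run):

```python
# TF2 is installed, but the program runs with TF1 semantics.
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# TF1-style graph construction: no eager execution, so this builds
# symbolic tensors rather than computing values immediately.
x = tf.placeholder(tf.float32, shape=[None])
doubled = 2.0 * x

# Values are produced only by running the graph in a Session.
with tf.Session() as sess:
    result = sess.run(doubled, feed_dict={x: [1.0, 2.0]})
print(result)  # -> [2. 4.]
```

This is the mode in which the "TF1 / TF1 compat mode in TF2" columns of the tables above apply, e.g. where hub.Module still works with the TF1 Hub format.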