tf.contrib.tpu.TPUDistributionStrategy

Class TPUDistributionStrategy

The strategy to run Keras model on TPU.

__init__

__init__(
    tpu_cluster_resolver=None,
    using_single_core=False
)

Construct a TPUDistributionStrategy.

Args:

  • tpu_cluster_resolver: Any instance of TPUClusterResolver. If None, one is created with '' as the master address.
  • using_single_core: Bool. A debugging option, which may be removed in the future once the model replication functionality is mature enough. If False (the default), the system automatically finds the best configuration for model replication, in terms of the number of TPU cores, typically using all available TPU cores. If overridden to True, forces the model to run on a single core, i.e., with no replication.

Raises:

  • Exception: If no TPU is found on the given worker.
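
A minimal usage sketch, assuming the TF 1.x contrib API and a TPU worker reachable from the environment (the TPU_NAME environment variable is an assumption of this example, not part of the API):

```python
import os
import tensorflow as tf

# Build and compile an ordinary Keras model (toy example).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='softmax', input_shape=(784,)),
])
model.compile(optimizer='sgd', loss='categorical_crossentropy')

# Resolve the TPU worker. Reading TPU_NAME from the environment is an
# assumption for this sketch; passing tpu_cluster_resolver=None instead
# creates a resolver with '' as the master address.
resolver = tf.contrib.cluster_resolver.TPUClusterResolver(
    tpu=os.environ.get('TPU_NAME'))
strategy = tf.contrib.tpu.TPUDistributionStrategy(resolver)

# Rewrite the compiled Keras model to run on the TPU under this strategy.
tpu_model = tf.contrib.tpu.keras_to_tpu_model(model, strategy=strategy)
```

Passing `using_single_core=True` to the strategy constructor would instead pin the model to one TPU core, which can help isolate replication-related bugs during debugging.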