

Class TPUDistributionStrategy

Defined in tensorflow/contrib/tpu/python/tpu/keras_support.py.

The strategy used to run a Keras model on a TPU.



__init__

Construct a TPUDistributionStrategy.


Args:

  • tpu_cluster_resolver: Any instance of TPUClusterResolver. If None, one will be created with '' as the master address.
  • using_single_core: Bool. This is a debugging option, which might be removed in the future once the model replication functionality is mature enough. If False (the default), the system automatically finds the best configuration, in terms of the number of TPU cores, for model replication, typically using all available TPU cores. If set to True, model replication is forced onto a single core, i.e., no replication.


Raises:

  • Exception: No TPU found on the given worker.
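As a usage sketch (assuming the TF 1.x tf.contrib.tpu API and a reachable TPU worker; the TPU address below is a placeholder, and this will only run on a machine with access to a TPU):

```python
import tensorflow as tf

# Build an ordinary Keras model first.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='softmax', input_shape=(784,)),
])
model.compile(optimizer='sgd', loss='categorical_crossentropy')

# Resolve the TPU worker to use; the address here is a placeholder.
resolver = tf.contrib.cluster_resolver.TPUClusterResolver(
    tpu='grpc://10.0.0.1:8470')

# Replicate across all available cores (using_single_core=False is the default).
strategy = tf.contrib.tpu.TPUDistributionStrategy(resolver)

# Convert the Keras model into one that runs on the TPU.
tpu_model = tf.contrib.tpu.keras_to_tpu_model(model, strategy=strategy)
```

The converted model is then trained and evaluated with the usual Keras fit/evaluate calls.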