


Class CollectiveCommunication

Communication choices for CollectiveOps.


Aliases

  • Class tf.compat.v1.distribute.experimental.CollectiveCommunication
  • Class tf.compat.v2.distribute.experimental.CollectiveCommunication

Class Members

  • AUTO: Defer to the runtime's automatic choice.
  • NCCL: Use ncclAllReduce for all-reduce; all-gather still uses ring algorithms.
  • RING: Use TensorFlow's ring algorithms for both all-reduce and all-gather.
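A member of this enum is typically passed to a collective-based distribution strategy to select the cross-device communication implementation. The sketch below assumes TensorFlow 2.0 and a multi-worker environment configured via the `TF_CONFIG` environment variable; on a single machine the strategy simply falls back to one worker.

```python
import tensorflow as tf

# Pick a communication implementation for collective ops.
# NCCL requires GPUs; AUTO lets the runtime decide (a safe default).
communication = tf.distribute.experimental.CollectiveCommunication.AUTO

# MultiWorkerMirroredStrategy uses CollectiveOps for cross-worker
# all-reduce, so it accepts a CollectiveCommunication choice.
strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy(
    communication=communication)
```

Variables and models created under `strategy.scope()` are then replicated across workers, with gradients aggregated using the chosen collective implementation.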