
Class CollectiveCommunication

Communication choices for CollectiveOps.


Aliases:

  • tf.compat.v1.distribute.experimental.CollectiveCommunication
  • tf.compat.v2.distribute.experimental.CollectiveCommunication
  • tf.distribute.experimental.CollectiveCommunication

Defined in python/distribute/cross_device_ops.py.

  • AUTO: Defer to the runtime's automatic choice.
  • RING: Use TensorFlow's ring algorithms for all-reduce and all-gather.
  • NCCL: Use ncclAllReduce for all-reduce, and ring algorithms for all-gather (an ncclAllGather implementation is not yet available; TODO(ayushd)).

Class Members

  • AUTO
  • NCCL
  • RING
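A minimal sketch of how these enum members are typically used: inspecting the choices, then passing one to `tf.distribute.experimental.MultiWorkerMirroredStrategy`, which runs CollectiveOps under the hood. The strategy construction is shown commented out because multi-worker training additionally requires a configured cluster (e.g. via `TF_CONFIG`); this is illustrative, not a complete setup.

```python
import tensorflow as tf

# The three communication choices exposed by the enum. AUTO defers the
# decision to the runtime; RING forces TensorFlow's ring algorithms;
# NCCL uses ncclAllReduce for all-reduce (GPU clusters).
CollectiveCommunication = tf.distribute.experimental.CollectiveCommunication

for choice in (CollectiveCommunication.AUTO,
               CollectiveCommunication.RING,
               CollectiveCommunication.NCCL):
    print(choice.name)

# Sketch: select NCCL communication for a multi-worker strategy.
# Uncommenting this requires a properly configured cluster.
# strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy(
#     communication=CollectiveCommunication.NCCL)
```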