
tf.distribute.experimental.CollectiveCommunication


Class CollectiveCommunication

Communication choices for CollectiveOps.

Aliases:

  • Class tf.compat.v1.distribute.experimental.CollectiveCommunication
  • Class tf.compat.v2.distribute.experimental.CollectiveCommunication

Class Members

  • AUTO: Default to the runtime's automatic choice.
  • RING: TensorFlow's ring algorithms for all-reduce and all-gather.
  • NCCL: Use ncclAllReduce for all-reduce, and ring algorithms for all-gather.
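A typical way to use one of these values is to pass it to a collective-based distribution strategy such as `tf.distribute.experimental.MultiWorkerMirroredStrategy`. The sketch below is illustrative: it assumes the TF 2.0-era `communication` constructor argument, and picks RING (which works on CPU-only machines) rather than NCCL (which requires GPUs).

```python
import tensorflow as tf

CollectiveCommunication = tf.distribute.experimental.CollectiveCommunication

# Choose a collective implementation explicitly; AUTO defers the
# choice to the runtime, RING is CPU-friendly, NCCL requires GPUs.
strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy(
    communication=CollectiveCommunication.RING)
```

With `AUTO` (the default), the runtime selects an implementation based on the available hardware, so an explicit choice is only needed when you want to override that selection.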