Module: tf.compat.v1.distribute.experimental

Experimental Distribution Strategy library.


class CentralStorageStrategy: A one-machine strategy that puts all variables on a single device.

class CollectiveCommunication: Cross device communication implementation (deprecated alias of CommunicationImplementation).

class CollectiveHints: Hints for collective operations like AllReduce.

class CommunicationImplementation: Cross device communication implementation.

class CommunicationOptions: Options for cross device communications like All-reduce.

class MultiWorkerMirroredStrategy: A distribution strategy for synchronous training on multiple workers.

class ParameterServerStrategy: An asynchronous multi-worker parameter server tf.distribute strategy.

class TPUStrategy: TPU distribution strategy implementation.
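
A minimal sketch of how one of these strategies is typically used, assuming TensorFlow 2.x with the compat.v1 API available and a CPU-only single machine: CentralStorageStrategy places all variables on a single device, and variables created inside its scope are managed by the strategy.

```python
import tensorflow as tf

# Sketch only: a one-machine CentralStorageStrategy. On a machine with
# no accelerators, compute runs on the CPU and variables live on the
# single central device.
strategy = tf.compat.v1.distribute.experimental.CentralStorageStrategy()

# Variables created under the strategy's scope are placed on the
# central parameter device rather than mirrored per replica.
with strategy.scope():
    v = tf.Variable(1.0)

# The number of compute replicas the strategy synchronizes over.
print(strategy.num_replicas_in_sync)
```

The same `strategy.scope()` pattern applies to the other strategies listed above; what differs is where variables are stored and how gradients are aggregated across devices or workers.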