Module: tf.distribute.experimental

Experimental Distribution Strategy library.

Modules

coordinator module: Public API for tf.distribute.experimental.coordinator namespace.

partitioners module: Public API for tf.distribute.experimental.partitioners namespace.

rpc module: Public API for tf.distribute.experimental.rpc namespace.

Classes

class CentralStorageStrategy: A one-machine strategy that puts all variables on a single device.
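
A minimal usage sketch (assuming a single machine with zero or more local GPUs; the model is illustrative):

    import tensorflow as tf

    # Variables are placed on one central device (the CPU, or the sole GPU);
    # compute is replicated across all local GPUs.
    strategy = tf.distribute.experimental.CentralStorageStrategy()
    with strategy.scope():
        # Variables created inside the scope go to the central device
        # rather than being mirrored per replica.
        model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
        model.compile(optimizer="sgd", loss="mse")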

class CollectiveCommunication: Cross-device communication implementation. Deprecated; use CommunicationImplementation instead.

class CollectiveHints: Hints for collective operations like AllReduce. Deprecated; use CommunicationOptions instead.

class CommunicationImplementation: Cross-device communication implementation.

class CommunicationOptions: Options for cross-device communication, such as all-reduce.
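
A sketch of how the two classes above fit together, assuming a GPU cluster where NCCL is available (the pack size and timeout values are illustrative):

    import tensorflow as tf

    # Request NCCL collectives, pack gradients into ~50 MB chunks, and
    # fail a collective op that stalls for more than two minutes.
    options = tf.distribute.experimental.CommunicationOptions(
        bytes_per_pack=50 * 1024 * 1024,
        timeout_seconds=120.0,
        implementation=(
            tf.distribute.experimental.CommunicationImplementation.NCCL))
    # Note: the non-experimental MultiWorkerMirroredStrategy accepts the
    # options object directly.
    strategy = tf.distribute.MultiWorkerMirroredStrategy(
        communication_options=options)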

class MultiWorkerMirroredStrategy: A distribution strategy for synchronous training on multiple workers.
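
A sketch of the per-worker program; every worker runs the same script, and cluster membership is normally read from the TF_CONFIG environment variable:

    import tensorflow as tf

    # Creates collective ops and resolves the cluster from TF_CONFIG.
    strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy()
    with strategy.scope():
        # Variables are mirrored on every worker and kept in sync by
        # all-reduce after each step.
        model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
        model.compile(optimizer="sgd", loss="mse")
    # model.fit(...) then trains synchronously across the cluster.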

class ParameterServerStrategy: A multi-worker tf.distribute strategy with parameter servers.
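
A sketch of the coordinator program, assuming the workers and parameter servers are described by TF_CONFIG (the partitioner choice is illustrative):

    import tensorflow as tf

    cluster_resolver = tf.distribute.cluster_resolver.TFConfigClusterResolver()
    strategy = tf.distribute.experimental.ParameterServerStrategy(
        cluster_resolver,
        variable_partitioner=(
            tf.distribute.experimental.partitioners.MinSizePartitioner(
                max_shards=2)))  # e.g. shard large variables across two PS
    coordinator = (
        tf.distribute.experimental.coordinator.ClusterCoordinator(strategy))
    # Variables created under strategy.scope() live on the parameter
    # servers; coordinator.schedule(...) dispatches steps to the workers.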

class PreemptionCheckpointHandler: Preemption and error handler for synchronous training.
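
A sketch, assuming a multi-worker cluster; the checkpoint directory, grace period, and step count are illustrative:

    import tensorflow as tf

    strategy = tf.distribute.MultiWorkerMirroredStrategy()
    with strategy.scope():
        model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
    checkpoint = tf.train.Checkpoint(model=model)
    handler = tf.distribute.experimental.PreemptionCheckpointHandler(
        strategy.cluster_resolver, checkpoint, checkpoint_dir="/tmp/ckpt",
        termination_config=tf.distribute.experimental.TerminationConfig(
            grace_period=10))

    @tf.function
    def train_step():
        pass  # one synchronous training step would go here

    for _ in range(100):
        # Runs the step; on a preemption signal, a consistent checkpoint
        # is saved on all workers before the job exits.
        handler.run(train_step)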

class PreemptionWatcher: Watches for a preemption signal and stores it.

class TPUStrategy: Synchronous training on TPUs and TPU Pods.
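
A sketch of the standard TPU setup sequence; tpu="" assumes a Colab or Cloud TPU worker environment:

    import tensorflow as tf

    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.experimental.TPUStrategy(resolver)
    with strategy.scope():
        # Variables are created once and replicated across TPU cores.
        model = tf.keras.Sequential([tf.keras.layers.Dense(1)])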

class TerminationConfig: Customization of PreemptionCheckpointHandler for various platforms.

class ValueContext: A class wrapping information needed by a distribute function.
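
A sketch showing where a ValueContext appears; the two-GPU device list is an assumption:

    import tensorflow as tf

    strategy = tf.distribute.MirroredStrategy(["GPU:0", "GPU:1"])

    def value_fn(ctx):
        # ctx is a tf.distribute.experimental.ValueContext carrying
        # replica_id_in_sync_group and num_replicas_in_sync.
        return ctx.replica_id_in_sync_group

    distributed_values = (
        strategy.experimental_distribute_values_from_function(value_fn))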