Partitioner that allocates a minimum size per shard.
Inherits From: Partitioner
tf.distribute.experimental.partitioners.MinSizePartitioner(
min_shard_bytes=(256 << 10), max_shards=1, bytes_per_string=16
)
This partitioner ensures that each shard has at least min_shard_bytes and tries
to allocate as many shards as possible, i.e., it keeps each shard as small as
possible. The upper bound on the number of shards is given by max_shards.
Examples:
partitioner = MinSizePartitioner(min_shard_bytes=4, max_shards=2)
partitions = partitioner(tf.TensorShape([6, 1]), tf.float32)
# [2, 1]
partitioner = MinSizePartitioner(min_shard_bytes=4, max_shards=10)
partitions = partitioner(tf.TensorShape([6, 1]), tf.float32)
# [6, 1]
# use in ParameterServerStrategy
# strategy = tf.distribute.experimental.ParameterServerStrategy(
#   cluster_resolver=cluster_resolver, variable_partitioner=partitioner)
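The shard counts above follow from simple size arithmetic: the variable's total size in bytes is divided by min_shard_bytes, then capped by max_shards and by the length of the partitioned axis. A minimal sketch of that reasoning, for illustration only (plain Python, not the library's implementation; the helper name expected_shards is made up):

import tensorflow as tf

def expected_shards(shape, dtype, min_shard_bytes, max_shards, axis=0):
  # Total size of the variable in bytes.
  total_bytes = shape.num_elements() * dtype.size
  # As many shards as possible while keeping each shard at least
  # min_shard_bytes, bounded by max_shards and by the axis length.
  by_size = max(1, total_bytes // min_shard_bytes)
  return min(by_size, max_shards, shape[axis])

print(expected_shards(tf.TensorShape([6, 1]), tf.float32, 4, 2))   # 2 -> [2, 1]
print(expected_shards(tf.TensorShape([6, 1]), tf.float32, 4, 10))  # 6 -> [6, 1]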
Methods
__call__
__call__(
shape, dtype, axis=0
)
Partitions the given shape and returns the partition results.
Example of a partitioner that allocates a fixed number of shards:
partitioner = FixedShardsPartitioner(num_shards=2)
partitions = partitioner(tf.TensorShape([10, 3]), tf.float32, axis=0)
print(partitions)  # [2, 1]
| Args | |
|---|---|
| shape | A tf.TensorShape, the shape to partition. |
| dtype | A tf.dtypes.DType indicating the type of the partition value. |
| axis | The axis to partition along. Default: outermost axis. |
| Returns | |
|---|---|
| A list of integers representing the number of partitions on each axis, where the i-th value corresponds to the i-th axis. |
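The example above uses FixedShardsPartitioner from the base-class documentation; calling a MinSizePartitioner works the same way. A short sketch mirroring the class-level examples (expected result taken from those examples):

partitioner = tf.distribute.experimental.partitioners.MinSizePartitioner(
    min_shard_bytes=4, max_shards=2)
partitions = partitioner(tf.TensorShape([6, 1]), tf.float32, axis=0)
print(partitions)  # [2, 1]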