A preprocessing layer which randomly varies image width during training.
Inherits From: Layer, Module
tf.keras.layers.RandomWidth(
factor, interpolation='bilinear', seed=None, **kwargs
)
This layer randomly adjusts the width of a batch of images by a random
factor. The input should be a 3D (unbatched) or 4D (batched) tensor in the
"channels_last" image data format. Input pixel values can be of any range
(e.g. [0., 1.) or [0, 255]) and of integer or floating point dtype. By
default, the layer will output floats.
By default, this layer is inactive during inference.
For an overview and full list of preprocessing layers, see the preprocessing
guide.
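A minimal usage sketch (the batch shape, factor, and seed below are illustrative assumptions, not part of the API reference):

import tensorflow as tf

# Hypothetical batch of 4 RGB images, 64x64, with pixel values in [0, 255].
images = tf.random.uniform((4, 64, 64, 3), maxval=255)

# Vary the width by a random amount of up to +/-20% during training.
layer = tf.keras.layers.RandomWidth(factor=0.2, seed=42)

augmented = layer(images, training=True)
print(augmented.shape)  # (4, 64, w, 3), with w drawn from roughly [51, 76]

# During inference the layer is inactive and passes the input through.
print(layer(images, training=False).shape)  # (4, 64, 64, 3)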
Args:
  factor: A positive float (fraction of original width), or a tuple of size
    2 representing lower and upper bound for resizing horizontally. When
    represented as a single float, this value is used for both the upper and
    lower bound. For instance, factor=(0.2, 0.3) results in an output with
    width changed by a random amount in the range [20%, 30%].
    factor=(-0.2, 0.3) results in an output with width changed by a random
    amount in the range [-20%, +30%]. factor=0.2 results in an output with
    width changed by a random amount in the range [-20%, +20%].
  interpolation: String, the interpolation method. Defaults to "bilinear".
    Supports "bilinear", "nearest", "bicubic", "area", "lanczos3",
    "lanczos5", "gaussian", "mitchellcubic".
  seed: Integer. Used to create a random seed.
Input shape:
  3D (unbatched) or 4D (batched) tensor with shape
  (..., height, width, channels), in "channels_last" format.
Output shape:
  3D (unbatched) or 4D (batched) tensor with shape
  (..., height, random_width, channels).
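For instance, a shape sketch assuming a fixed (0.5, 0.5) factor so the resize is deterministic (the input sizes are illustrative):

import tensorflow as tf

image = tf.random.uniform((32, 32, 3))  # one unbatched 32x32 RGB image

# With lower == upper == 0.5, the width is always scaled by +50%.
layer = tf.keras.layers.RandomWidth(factor=(0.5, 0.5))
output = layer(image, training=True)
print(output.shape)  # (32, 48, 3): height preserved, width 32 -> 48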
Attributes:
  auto_vectorize: Controls whether automatic vectorization occurs.
    By default the call() method leverages the tf.vectorized_map()
    function. Auto-vectorization can be disabled by setting
    self.auto_vectorize = False in your __init__() method. When
    disabled, call() instead relies on tf.map_fn(). For example:

    class SubclassLayer(BaseImageAugmentationLayer):
      def __init__(self):
        super().__init__()
        self.auto_vectorize = False