Applies dropout to the input.
Inherits From: Layer, Operation
tf.keras.layers.Dropout(
    rate, noise_shape=None, seed=None, **kwargs
)
The Dropout layer randomly sets input units to 0 with a frequency of rate at each step during training time, which helps prevent overfitting. Inputs not set to 0 are scaled up by 1 / (1 - rate) such that the sum over all inputs is unchanged.
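For example, the following sketch (the fixed seeds are illustrative assumptions) applies a Dropout layer with rate=0.5 to a small batch; in training mode roughly half of the units are zeroed and the survivors are scaled by 1 / (1 - 0.5) = 2, while in inference mode the input passes through unchanged:

```python
import numpy as np
import tensorflow as tf

tf.random.set_seed(0)  # illustrative seed so the dropout mask is repeatable
layer = tf.keras.layers.Dropout(rate=0.5, seed=0)
data = np.arange(1, 11, dtype="float32").reshape(2, 5)

# Training mode: roughly half of the units are set to 0, the rest scaled by 2.
print(layer(data, training=True))

# Inference mode: dropout is a no-op and the data is returned as-is.
print(layer(data, training=False))
```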
Note that the Dropout layer only applies when training is set to True in call(), such that no values are dropped during inference. When using model.fit, training will be appropriately set to True automatically. In other contexts, you can set the argument explicitly to True when calling the layer.

(This is in contrast to setting trainable=False for a Dropout layer. trainable does not affect the layer's behavior, as Dropout does not have any variables/weights that can be frozen during training.)
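As a rough sketch (the tiny functional model below is an assumption made only for illustration), the training argument and the trainable attribute behave quite differently:

```python
import numpy as np
import tensorflow as tf

# Minimal model containing only a Dropout layer (illustrative assumption).
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dropout(0.5)(inputs)
model = tf.keras.Model(inputs, outputs)

x = np.ones((2, 4), dtype="float32")

print(model(x))                 # inference by default: dropout does nothing
print(model(x, training=True))  # explicit training mode: units are dropped

# trainable=False does not disable dropout; the layer has no weights to freeze.
model.layers[1].trainable = False
print(model(x, training=True))  # dropout is still applied
```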
| Call arguments | |
|---|---|
| inputs | Input tensor (of any rank). |
| training | Python boolean indicating whether the layer should behave in training mode (adding dropout) or in inference mode (doing nothing). |
Methods
from_config

@classmethod
from_config(config)
Creates a layer from its config.

This method is the reverse of get_config, capable of instantiating the same layer from the config dictionary. It does not handle layer connectivity (handled by Network), nor weights (handled by set_weights).
| Args | |
|---|---|
| config | A Python dictionary, typically the output of get_config. |
| Returns | |
|---|---|
| A layer instance. | |
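A minimal sketch of the get_config / from_config round trip (the rate and seed values are arbitrary):

```python
import tensorflow as tf

layer = tf.keras.layers.Dropout(rate=0.2, seed=42)
config = layer.get_config()  # plain Python dict describing the layer

# Rebuild an equivalent (but freshly constructed) layer from that dict.
clone = tf.keras.layers.Dropout.from_config(config)
assert clone.rate == layer.rate
```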
symbolic_call

symbolic_call(*args, **kwargs)