tf.keras.layers.Dropout
Applies Dropout to the input.
Inherits From: Layer
tf.keras.layers.Dropout(
rate, noise_shape=None, seed=None, **kwargs
)
Dropout randomly sets a fraction `rate` of the input units to 0 at each update during training, which helps prevent overfitting.
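A minimal sketch of this behavior (the layer name and arguments come from this page; in the Keras implementation the retained units are also scaled up by 1 / (1 - rate) during training so the expected sum is unchanged, a detail not stated here):

```python
import tensorflow as tf

tf.random.set_seed(0)
layer = tf.keras.layers.Dropout(rate=0.2, seed=0)
data = tf.ones((4, 10))

# Training mode: roughly 20% of units are zeroed; the survivors
# are scaled by 1 / (1 - 0.2) = 1.25.
train_out = layer(data, training=True)

# Inference mode: the layer is a no-op and returns the input as-is.
infer_out = layer(data, training=False)
```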
Arguments:

rate: Float between 0 and 1. Fraction of the input units to drop.
noise_shape: 1D integer tensor representing the shape of the binary dropout mask that will be multiplied with the input. For instance, if your inputs have shape (batch_size, timesteps, features) and you want the dropout mask to be the same for all timesteps, you can use noise_shape=(batch_size, 1, features).
seed: A Python integer to use as random seed.
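The noise_shape example above can be sketched as follows (concrete shapes are illustrative, not from this page):

```python
import tensorflow as tf

tf.random.set_seed(1)
# Inputs of shape (batch_size=2, timesteps=3, features=4); the mask
# shape (2, 1, 4) broadcasts over the timestep axis, so each sequence
# drops the same features at every timestep.
layer = tf.keras.layers.Dropout(rate=0.5, noise_shape=(2, 1, 4), seed=1)
x = tf.ones((2, 3, 4))
y = layer(x, training=True)
# Every timestep within a batch element now shares the same zero pattern.
```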
Call arguments:

inputs: Input tensor (of any rank).
training: Python boolean indicating whether the layer should behave in training mode (adding dropout) or in inference mode (doing nothing).
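In a typical Keras workflow the training flag is passed for you: fitting a model runs its layers with training=True, while direct calls and prediction run with training=False, so dropout is inactive at inference. A small sketch (the model architecture here is an assumption for illustration):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1),
])

x = tf.ones((2, 4))
# Calling the model directly runs in inference mode by default,
# so the Dropout layer does nothing and the output is deterministic.
y1 = model(x)
y2 = model(x)
```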
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2020-10-01 UTC.