tf.keras.layers.SpectralNormalization
Performs spectral normalization on the weights of a target layer.
Inherits From: Wrapper, Layer, Module
tf.keras.layers.SpectralNormalization(
layer, power_iterations=1, **kwargs
)
This wrapper controls the Lipschitz constant of the weights of a layer by
constraining their spectral norm, which can stabilize the training of GANs.
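The spectral norm of a weight matrix is its largest singular value, and dividing the weights by it bounds the layer's Lipschitz constant (with respect to the 2-norm) at 1. A minimal NumPy sketch of that idea, separate from the wrapper itself (the variable names here are illustrative only):

    import numpy as np

    # A random matrix standing in for a layer kernel.
    W = np.random.rand(4, 3)

    # Spectral norm = largest singular value.
    sigma = np.linalg.svd(W, compute_uv=False)[0]

    # W / sigma has spectral norm 1, so the map x -> x @ (W / sigma)
    # is 1-Lipschitz in the 2-norm.
    W_sn = W / sigma
    print(np.linalg.svd(W_sn, compute_uv=False)[0])  # ~1.0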
Args

layer: A keras.layers.Layer instance that has either a kernel attribute (e.g. Conv2D, Dense, ...) or an embeddings attribute (Embedding layer).
power_iterations: int, the number of power iterations to run when estimating the spectral norm during normalization. Defaults to 1.
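To show how these arguments are typically supplied in practice, here is a hedged usage sketch of a small GAN discriminator whose kernel-carrying layers are wrapped. The architecture and the make_discriminator helper are hypothetical, not part of the API:

    import tensorflow as tf

    # Hypothetical DCGAN-style discriminator; each kernel-carrying layer is
    # wrapped, and power_iterations is passed explicitly (1 is the default).
    def make_discriminator():
        return tf.keras.Sequential([
            tf.keras.layers.SpectralNormalization(
                tf.keras.layers.Conv2D(64, 4, strides=2, padding="same"),
                power_iterations=1),
            tf.keras.layers.LeakyReLU(0.2),
            tf.keras.layers.SpectralNormalization(
                tf.keras.layers.Conv2D(128, 4, strides=2, padding="same"),
                power_iterations=1),
            tf.keras.layers.LeakyReLU(0.2),
            tf.keras.layers.Flatten(),
            tf.keras.layers.SpectralNormalization(tf.keras.layers.Dense(1)),
        ])

    disc = make_discriminator()
    disc(tf.random.normal([2, 64, 64, 3]))  # first call builds the wrapped kernels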
Examples:
Wrap keras.layers.Conv2D:
>>> import numpy as np
>>> import tensorflow as tf
>>> x = np.random.rand(1, 10, 10, 1)
>>> conv2d = tf.keras.layers.SpectralNormalization(tf.keras.layers.Conv2D(2, 2))
>>> y = conv2d(x)
>>> y.shape
TensorShape([1, 9, 9, 2])
Wrap keras.layers.Dense:
>>> x = np.random.rand(1, 10, 10, 1)
>>> dense = tf.keras.layers.SpectralNormalization(tf.keras.layers.Dense(10))
>>> y = dense(x)
>>> y.shape
TensorShape([1, 10, 10, 10])
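As a rough check of what the wrapper does, the largest singular value of the wrapped kernel should approach 1 after several training-mode calls. This sketch assumes the wrapper re-normalizes its kernel on calls with training=True and exposes the wrapped layer as .layer; those are assumptions about the implementation, not statements from this page:

    import numpy as np
    import tensorflow as tf

    x = np.random.rand(4, 10, 10, 1).astype("float32")
    conv2d = tf.keras.layers.SpectralNormalization(tf.keras.layers.Conv2D(2, 2))

    # Assumption: each training-mode call runs another power iteration, so the
    # spectral-norm estimate (and hence the normalization) tightens over calls.
    for _ in range(10):
        conv2d(x, training=True)

    # Largest singular value of the flattened kernel should be close to 1.
    kernel = conv2d.layer.kernel.numpy()
    kernel_2d = kernel.reshape(-1, kernel.shape[-1])
    print(np.linalg.svd(kernel_2d, compute_uv=False)[0])  # ~1.0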
Reference:
Spectral Normalization for GAN: https://arxiv.org/abs/1802.05957
Methods
normalize_weights
View source: https://github.com/keras-team/keras/tree/v2.13.1/keras/layers/normalization/spectral_normalization.py#L108-L136
normalize_weights()
Generates spectrally normalized weights.

This method updates the value of self.kernel with the spectrally normalized value, so that the layer is ready for call().
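For intuition, the approach can be sketched as a power iteration that estimates the largest singular value of the flattened kernel and divides the kernel by it. This is an illustrative standalone sketch, not the library's source; the spectral_normalize helper and its argument names are made up for this example (the real method works in place on the wrapper's own variables):

    import tensorflow as tf

    def spectral_normalize(kernel, u, power_iterations=1):
        """Sketch: returns (kernel / sigma, updated u) via power iteration."""
        # Flatten the kernel to a 2-D matrix of shape [prod(other dims), output_dim].
        w = tf.reshape(kernel, [-1, kernel.shape[-1]])
        for _ in range(power_iterations):
            # Alternately refine approximations of the top singular vectors.
            v = tf.math.l2_normalize(tf.matmul(u, w, transpose_b=True))
            u = tf.math.l2_normalize(tf.matmul(v, w))
        # sigma approximates the largest singular value (spectral norm) of w.
        sigma = tf.matmul(tf.matmul(v, w), u, transpose_b=True)
        return tf.reshape(w / sigma, kernel.shape), u

    # Usage: u persists across steps, so each step refines the estimate.
    kernel = tf.random.normal([3, 3, 1, 8])
    u = tf.math.l2_normalize(tf.random.normal([1, 8]))
    normalized, u = spectral_normalize(kernel, u, power_iterations=5)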