tf.keras.initializers.LecunNormal
Lecun normal initializer.
Inherits From: VarianceScaling
tf.keras.initializers.LecunNormal(seed=None)
Also available via the shortcut function tf.keras.initializers.lecun_normal.
Initializers allow you to pre-specify an initialization strategy, encoded in
the Initializer object, without knowing the shape and dtype of the variable
being initialized.
Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(1 / fan_in), where fan_in is the number of input units in the weight tensor.
Examples:
# Standalone usage:
initializer = tf.keras.initializers.LecunNormal()
values = initializer(shape=(2, 2))
# Usage in a Keras layer:
initializer = tf.keras.initializers.LecunNormal()
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
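To make the sampling rule above concrete, here is a NumPy sketch of it (not the TensorFlow implementation): draw from a normal with stddev = sqrt(1 / fan_in) and redraw anything beyond two standard deviations. The truncation bound of two stddevs and the assumption of a 2-D (fan_in, fan_out) weight shape are this sketch's, not guaranteed details of the library.

```python
import numpy as np

def lecun_normal_sample(shape, seed=None):
    """Sketch of LeCun normal: truncated normal, stddev = sqrt(1 / fan_in).

    Assumes a 2-D (fan_in, fan_out) weight shape and truncation at two
    standard deviations, with rejected samples redrawn.
    """
    rng = np.random.default_rng(seed)
    fan_in = shape[0]
    stddev = np.sqrt(1.0 / fan_in)
    samples = rng.normal(0.0, stddev, size=shape)
    # Redraw any sample that falls outside +/- 2 stddev (truncation).
    mask = np.abs(samples) > 2 * stddev
    while mask.any():
        samples[mask] = rng.normal(0.0, stddev, size=mask.sum())
        mask = np.abs(samples) > 2 * stddev
    return samples

w = lecun_normal_sample((400, 300), seed=0)
print(w.shape)  # (400, 300)
```

Because of the truncation, the empirical stddev comes out slightly below sqrt(1 / fan_in); Keras compensates for this internally.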
Arguments:
seed: A Python integer. Used to seed the random generator.
References:
- Self-Normalizing Neural Networks, Klambauer et al., 2017 (https://papers.nips.cc/paper/6698-self-normalizing-neural-networks)
- Efficient Backprop, Lecun et al., 1998 (http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf)
Methods
from_config
@classmethod
from_config(config)
Instantiates an initializer from a configuration dictionary.
Example:
initializer = RandomUniform(-1, 1)
config = initializer.get_config()
initializer = RandomUniform.from_config(config)
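The round trip shown above relies only on get_config returning the constructor's keyword arguments as a dict. A plain-Python analogue of the pattern (the class and attribute names here are hypothetical, for illustration only):

```python
class UniformInit:
    """Minimal analogue of a Keras initializer's config round trip."""

    def __init__(self, minval=-0.05, maxval=0.05):
        self.minval = minval
        self.maxval = maxval

    def get_config(self):
        # A JSON-serializable dict of constructor kwargs.
        return {"minval": self.minval, "maxval": self.maxval}

    @classmethod
    def from_config(cls, config):
        # Reconstruct the instance from the kwargs dict.
        return cls(**config)

init = UniformInit(-1, 1)
clone = UniformInit.from_config(init.get_config())
print(clone.minval, clone.maxval)  # -1 1
```

Because the config is a plain JSON-serializable dict, it can be written to disk and used to rebuild an equivalent initializer later.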
Args:
config: A Python dictionary. It will typically be the output of get_config.
Returns:
An Initializer instance.
get_config
get_config()
Returns the configuration of the initializer as a JSON-serializable dict.
Returns:
A JSON-serializable Python dict.
__call__
__call__(shape, dtype=None)
Returns a tensor object initialized as specified by the initializer.
Args:
shape: Shape of the tensor.
dtype: Optional dtype of the tensor. Only floating point types are supported. If not specified, tf.keras.backend.floatx() is used, which defaults to float32 unless you configured it otherwise (via tf.keras.backend.set_floatx(float_dtype)).
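The stddev used for a given call follows directly from the shape argument. A hypothetical helper mirroring the usual fan-in convention for dense and convolutional kernels (spatial dims times input channels); this is an illustration of the rule, not the library's internal code:

```python
import math

def fan_in(shape):
    """Number of input units for a weight tensor.

    For a dense kernel (fan_in, fan_out) this is shape[0]; for a conv
    kernel (*spatial, in_channels, out_channels) it is the product of
    the spatial dims and in_channels. Hypothetical helper mirroring
    the VarianceScaling fan-in convention.
    """
    if len(shape) < 2:
        return shape[0] if shape else 1
    receptive_field = math.prod(shape[:-2]) if len(shape) > 2 else 1
    return receptive_field * shape[-2]

print(fan_in((400, 300)))       # 400 (Dense kernel)
print(fan_in((3, 3, 64, 128)))  # 576 (3x3 conv, 64 input channels)
print(math.sqrt(1.0 / fan_in((400, 300))))  # stddev = 0.05
```

So a Dense layer with 400 inputs gets weights drawn with stddev 0.05 under this initializer.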
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2020-10-01 UTC.