tf.keras.initializers.lecun_normal
LeCun normal initializer.

Main alias: `tf.initializers.lecun_normal`
tf.keras.initializers.lecun_normal(
    seed=None
)
Initializers allow you to pre-specify an initialization strategy, encoded in
the Initializer object, without knowing the shape and dtype of the variable
being initialized.
Draws samples from a truncated normal distribution centered on 0 with
`stddev = sqrt(1 / fan_in)`, where `fan_in` is the number of input units in
the weight tensor.
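As a quick sanity check of that claim, here is a minimal sketch (the kernel shape is an arbitrary choice, not taken from this page) that draws one kernel and compares its empirical standard deviation against `sqrt(1 / fan_in)`:

import tensorflow as tf

# Sketch: draw one kernel and inspect its empirical stddev. For a
# kernel of shape [fan_in, fan_out] = [256, 128], fan_in = 256, so
# the target stddev is sqrt(1 / 256) = 0.0625.
init = tf.keras.initializers.lecun_normal(seed=42)
kernel = init(shape=[256, 128], dtype=tf.float32)
print(tf.math.reduce_std(kernel).numpy())  # close to 0.0625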
Examples:
def make_variables(k, initializer):
  return (tf.Variable(initializer(shape=[k, k], dtype=tf.float32)),
          tf.Variable(initializer(shape=[k, k, k], dtype=tf.float32)))

v1, v2 = make_variables(3, tf.initializers.lecun_normal())
v1
<tf.Variable ... shape=(3, 3) ...
v2
<tf.Variable ... shape=(3, 3, 3) ...
make_variables(4, tf.initializers.RandomNormal())
(<tf.Variable ... shape=(4, 4) dtype=float32...
 <tf.Variable ... shape=(4, 4, 4) dtype=float32...
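In typical use the initializer is passed to a layer rather than called directly. The sketch below is illustrative only (the layer sizes and seed are assumptions, not from this page); pairing lecun_normal with the selu activation follows the Self-Normalizing Neural Networks reference listed at the end:

import tensorflow as tf

# Sketch: lecun_normal supplied as a layer's kernel_initializer.
# Using it with the "selu" activation follows the Self-Normalizing
# Neural Networks reference; the layer sizes here are illustrative.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        64, activation="selu",
        kernel_initializer=tf.keras.initializers.lecun_normal(seed=0),
        input_shape=(32,)),
    tf.keras.layers.Dense(10),
])
model.summary()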
Arguments:
    seed: A Python integer. Used to seed the random generator.

Returns:
    A callable Initializer with `shape` and `dtype` arguments which
    generates a tensor.
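Because this v2 implementation draws with stateless random ops when a seed is supplied, two initializer instances built with the same seed should produce identical tensors. A quick sketch (the shape is arbitrary):

import tensorflow as tf

# Sketch: the same seed should reproduce the same samples across
# independent initializer instances (shape [2, 2] is arbitrary).
a = tf.keras.initializers.lecun_normal(seed=7)(shape=[2, 2], dtype=tf.float32)
b = tf.keras.initializers.lecun_normal(seed=7)(shape=[2, 2], dtype=tf.float32)
print(tf.reduce_all(tf.equal(a, b)).numpy())  # True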
References:
  - Self-Normalizing Neural Networks, Klambauer et al., 2017 (https://papers.nips.cc/paper/6698-self-normalizing-neural-networks)
  - Efficient Backprop, Lecun et al., 1998 (http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf)