Lecun normal initializer.
Inherits From: VarianceScaling, Initializer
tf.keras.initializers.LecunNormal(
    seed=None
)
Also available via the shortcut function
tf.keras.initializers.lecun_normal.
Initializers allow you to pre-specify an initialization strategy, encoded in the Initializer object, without knowing the shape and dtype of the variable being initialized.
Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(1 / fan_in), where fan_in is the number of input units in the weight tensor.
Examples:
# Standalone usage:
initializer = tf.keras.initializers.LecunNormal()
values = initializer(shape=(2, 2))
# Usage in a Keras layer:
initializer = tf.keras.initializers.LecunNormal()
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
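A minimal sketch (not part of the original examples) that checks the stddev relation described above: for a weight of shape (fan_in, fan_out), the empirical standard deviation of the drawn samples should be roughly sqrt(1 / fan_in). The shape and seed are arbitrary choices for illustration.
import numpy as np
import tensorflow as tf

fan_in, fan_out = 256, 64
initializer = tf.keras.initializers.LecunNormal(seed=0)
values = initializer(shape=(fan_in, fan_out))

# Samples come from a truncated normal, so the empirical stddev
# only approximately matches sqrt(1 / fan_in).
print(float(np.std(values.numpy())))   # roughly 0.0625
print(np.sqrt(1.0 / fan_in))           # 0.0625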
| Args | |
|---|---|
| seed | A Python integer. Used to create random seeds. See tf.compat.v1.set_random_seed for behavior. Note that a seeded initializer will not produce the same random values across multiple calls, but multiple initializers will produce the same sequence when constructed with the same seed value. | 
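The following sketch illustrates the seeding note above (the seed value and shapes are arbitrary): repeated calls to one seeded initializer continue a single random sequence, while a second initializer built with the same seed should reproduce that sequence.
import tensorflow as tf

init_a = tf.keras.initializers.LecunNormal(seed=42)
v1 = init_a(shape=(2, 2))
v2 = init_a(shape=(2, 2))  # per the note above, not identical to v1

init_b = tf.keras.initializers.LecunNormal(seed=42)
w1 = init_b(shape=(2, 2))  # same seed, same sequence: w1 should match v1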
Methods
from_config
@classmethod
from_config(
    config
)
Instantiates an initializer from a configuration dictionary.
Example:
initializer = RandomUniform(-1, 1)
config = initializer.get_config()
initializer = RandomUniform.from_config(config)
| Args | |
|---|---|
| config | A Python dictionary, the output of get_config. | 
| Returns | |
|---|---|
| A tf.keras.initializers.Initializer instance. | 
get_config
get_config()
Returns the configuration of the initializer as a JSON-serializable dict.
| Returns | |
|---|---|
| A JSON-serializable Python dict. | 
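A small illustrative round-trip (not from the original docs); for LecunNormal the config is expected to contain just the seed.
import tensorflow as tf

initializer = tf.keras.initializers.LecunNormal(seed=7)
config = initializer.get_config()
print(config)  # expected to be a dict such as {'seed': 7}

restored = tf.keras.initializers.LecunNormal.from_config(config)
print(restored.get_config() == config)  # True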
__call__
__call__(
    shape, dtype=None, **kwargs
)
Returns a tensor object initialized as specified by the initializer.
| Args | |
|---|---|
| shape | Shape of the tensor. | 
| dtype | Optional dtype of the tensor. Only floating point types are supported. If not specified, tf.keras.backend.floatx() is used, which defaults to float32 unless you configured it otherwise (via tf.keras.backend.set_floatx(float_dtype)). | 
| **kwargs | Additional keyword arguments. |
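A brief sketch (shape, seed, and dtype chosen only for illustration) of calling the initializer directly with an explicit floating-point dtype:
import tensorflow as tf

initializer = tf.keras.initializers.LecunNormal(seed=0)
values = initializer(shape=(3, 4), dtype=tf.float64)
print(values.shape)       # (3, 4)
print(values.dtype.name)  # float64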