tf.keras.initializers.LecunNormal

Lecun normal initializer.

Inherits From: VarianceScaling

Also available via the shortcut function tf.keras.initializers.lecun_normal.

Initializers allow you to pre-specify an initialization strategy, encoded in the Initializer object, without knowing the shape and dtype of the variable being initialized.

Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(1 / fan_in), where fan_in is the number of input units in the weight tensor.

Examples:

# Standalone usage:
initializer = tf.keras.initializers.LecunNormal()
values = initializer(shape=(2, 2))

# Usage in a Keras layer:
initializer = tf.keras.initializers.LecunNormal()
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
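
As a quick check of the stated statistics, the empirical standard deviation of a large sample should come out close to sqrt(1 / fan_in). A minimal sketch (fan_in = 256 here is an arbitrary choice; the sample stddev will vary slightly from run to run):

import tensorflow as tf

fan_in = 256
initializer = tf.keras.initializers.LecunNormal()
values = initializer(shape=(fan_in, 512))
# Should print a value close to sqrt(1 / 256) = 0.0625.
print(float(tf.math.reduce_std(values)))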

Args

seed A Python integer. Used to seed the random generator.
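
Passing an integer seed makes draws reproducible across freshly constructed initializers. A minimal sketch, assuming TF 2.x eager execution:

import tensorflow as tf

a = tf.keras.initializers.LecunNormal(seed=42)(shape=(2, 2))
b = tf.keras.initializers.LecunNormal(seed=42)(shape=(2, 2))
# Two instances built with the same seed produce identical values.
print(bool(tf.reduce_all(a == b)))  # True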

Methods

from_config

Instantiates an initializer from a configuration dictionary.

Example:

initializer = tf.keras.initializers.RandomUniform(-1, 1)
config = initializer.get_config()
initializer = tf.keras.initializers.RandomUniform.from_config(config)

Args
config A Python dictionary. It will typically be the output of get_config.

Returns
An Initializer instance.
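
For example, a round trip with this initializer (a minimal sketch; for LecunNormal the config dictionary simply carries the seed):

import tensorflow as tf

initializer = tf.keras.initializers.LecunNormal(seed=7)
config = initializer.get_config()  # typically {'seed': 7}
restored = tf.keras.initializers.LecunNormal.from_config(config)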