tf.keras.initializers.GlorotNormal

The Glorot normal initializer, also called Xavier normal initializer.

Inherits From: VarianceScaling, Initializer

Also available via the shortcut function tf.keras.initializers.glorot_normal.

Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / (fan_in + fan_out)) where fan_in is the number of input units in the weight tensor and fan_out is the number of output units in the weight tensor.
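The sampling rule above can be sketched in NumPy. This is a hedged illustration only (the helper glorot_normal_sample is hypothetical, and Keras's actual implementation additionally rescales the stddev to compensate for the truncation):

```python
import numpy as np

def glorot_normal_sample(fan_in, fan_out, size, rng=None):
    """Sketch of Glorot/Xavier normal sampling: a normal distribution
    centered on 0 with stddev = sqrt(2 / (fan_in + fan_out)), truncated
    to two standard deviations via rejection sampling."""
    rng = rng or np.random.default_rng()
    stddev = np.sqrt(2.0 / (fan_in + fan_out))
    samples = rng.normal(0.0, stddev, size)
    # Redraw any value that falls outside two standard deviations.
    out_of_bounds = np.abs(samples) > 2 * stddev
    while out_of_bounds.any():
        samples[out_of_bounds] = rng.normal(0.0, stddev, out_of_bounds.sum())
        out_of_bounds = np.abs(samples) > 2 * stddev
    return samples

# A 2x2 kernel for a layer mapping 2 inputs to 2 outputs: fan_in = fan_out = 2.
values = glorot_normal_sample(fan_in=2, fan_out=2, size=(2, 2))
```

Keeping the variance proportional to 2 / (fan_in + fan_out) is what balances the scale of activations and gradients across layers of different widths.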

Examples:

# Standalone usage:
initializer = tf.keras.initializers.GlorotNormal()
values = initializer(shape=(2, 2))

# Usage in a Keras layer:
initializer = tf.keras.initializers.GlorotNormal()
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Args

seed A Python integer. An initializer created with a given seed will always produce the same random tensor for a given shape and dtype.
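The determinism the seed argument provides can be illustrated with a small NumPy stand-in (make_sampler is a hypothetical helper, not a Keras API; a seeded Keras initializer behaves analogously):

```python
import numpy as np

def make_sampler(seed):
    """Stand-in for a seeded initializer: every call with the same seed
    reproduces the same random tensor for a given shape."""
    def sample(shape):
        rng = np.random.default_rng(seed)  # fresh, identically seeded generator per call
        return rng.normal(0.0, 1.0, shape)
    return sample

a = make_sampler(42)((2, 2))
b = make_sampler(42)((2, 2))
# Same seed and shape -> identical values.
assert np.array_equal(a, b)
```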

References:

Glorot & Bengio, 2010: Understanding the difficulty of training deep feedforward neural networks.

Methods

from_config

Instantiates an initializer from a configuration dictionary.

Example:

initializer = RandomUniform(-1, 1)
config = initializer.get_config()
initializer = RandomUniform.from_config(config)

Args
config A Python dictionary, the output of get_config.

Returns
A tf.keras.initializers.Initializer instance.
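The round trip that from_config enables can be sketched in pure Python. MyGlorotLike is a hypothetical stand-in used only to illustrate the config protocol, not the Keras class itself:

```python
# Minimal stand-in for the Initializer config protocol: get_config
# returns a plain dict of constructor arguments, and from_config
# rebuilds an equivalent object from that dict.
class MyGlorotLike:
    def __init__(self, seed=None):
        self.seed = seed

    def get_config(self):
        return {"seed": self.seed}

    @classmethod
    def from_config(cls, config):
        return cls(**config)

initializer = MyGlorotLike(seed=42)
config = initializer.get_config()
restored = MyGlorotLike.from_config(config)
assert restored.seed == initializer.seed
```

Because the config is a plain dict, this is the mechanism Keras relies on to serialize and reconstruct initializers when saving and loading models.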

get_config

Returns the configuration of the initializer as a JSON-serializable dict.

Returns
A JSON-serializable Python dict.