tf.keras.initializers.VarianceScaling

Initializer capable of adapting its scale to the shape of weight tensors.

Inherits From: Initializer

Also available via the shortcut function tf.keras.initializers.variance_scaling.

With distribution="truncated_normal" or "untruncated_normal", samples are drawn from a truncated/untruncated normal distribution with a mean of zero and a standard deviation (after truncation, if used) stddev = sqrt(scale / n), where n is:

  • number of input units in the weight tensor, if mode="fan_in"
  • number of output units, if mode="fan_out"
  • average of the numbers of input and output units, if mode="fan_avg"

With distribution="uniform", samples are drawn from a uniform distribution within [-limit, limit], where limit = sqrt(3 * scale / n).
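The formulas above can be sketched in plain Python. This is an illustrative helper, not the library's implementation: `compute_fans` and `scale_for` are hypothetical names, and the fan computation assumes a 2-D kernel of shape `(fan_in, fan_out)` (higher-rank kernels also fold the receptive-field size into both fans).

```python
import math

def compute_fans(shape):
    # Assumes a 2-D kernel (fan_in, fan_out); illustrative only.
    fan_in, fan_out = shape[0], shape[-1]
    return fan_in, fan_out

def scale_for(shape, scale=1.0, mode='fan_in'):
    # scale / n, where n depends on the chosen mode.
    fan_in, fan_out = compute_fans(shape)
    n = {'fan_in': fan_in,
         'fan_out': fan_out,
         'fan_avg': (fan_in + fan_out) / 2.0}[mode]
    return scale / max(1.0, n)

# stddev for the normal distributions: sqrt(scale / n)
stddev = math.sqrt(scale_for((64, 32), scale=2.0, mode='fan_in'))
# limit for the uniform distribution: sqrt(3 * scale / n)
limit = math.sqrt(3 * scale_for((64, 32), scale=1.0, mode='fan_avg'))
```

For a `(64, 32)` kernel, `mode='fan_in'` gives n = 64 and `mode='fan_avg'` gives n = 48.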

Examples:

# Standalone usage:
initializer = tf.keras.initializers.VarianceScaling(
    scale=0.1, mode='fan_in', distribution='uniform')
values = initializer(shape=(2, 2))

# Usage in a Keras layer:
initializer = tf.keras.initializers.VarianceScaling(
    scale=0.1, mode='fan_in', distribution='uniform')
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
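As a point of reference (not stated on this page), several stock Keras initializers are specific VarianceScaling configurations. The sketch below records that parameter mapping in plain Python and computes the resulting bound for a given kernel shape; `PRESETS` and `bound` are illustrative names, and the fan computation again assumes a 2-D kernel.

```python
import math

# Parameter presets, per the Keras initializer docs; illustrative only.
PRESETS = {
    'glorot_uniform': dict(scale=1.0, mode='fan_avg', distribution='uniform'),
    'glorot_normal':  dict(scale=1.0, mode='fan_avg', distribution='truncated_normal'),
    'he_uniform':     dict(scale=2.0, mode='fan_in',  distribution='uniform'),
    'he_normal':      dict(scale=2.0, mode='fan_in',  distribution='truncated_normal'),
    'lecun_uniform':  dict(scale=1.0, mode='fan_in',  distribution='uniform'),
    'lecun_normal':   dict(scale=1.0, mode='fan_in',  distribution='truncated_normal'),
}

def bound(shape, scale, mode, distribution):
    # Returns the uniform limit or the normal stddev for the preset.
    fan_in, fan_out = shape[0], shape[-1]
    n = {'fan_in': fan_in, 'fan_out': fan_out,
         'fan_avg': (fan_in + fan_out) / 2.0}[mode]
    if distribution == 'uniform':
        return math.sqrt(3.0 * scale / n)   # limit = sqrt(3 * scale / n)
    return math.sqrt(scale / n)             # stddev = sqrt(scale / n)

# e.g. he_normal stddev for a (256, 128) kernel is sqrt(2 / 256)
he_stddev = bound((256, 128), **PRESETS['he_normal'])
```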

Args:

scale         Scaling factor (positive float).
mode          One of "fan_in", "fan_out", or "fan_avg".
distribution  Random distribution to use. One of "truncated_normal",
              "untruncated_normal", or "uniform".