tf.keras.initializers.LecunUniform

Lecun uniform initializer.

Inherits From: VarianceScaling

Also available via the shortcut function tf.keras.initializers.lecun_uniform.

Draws samples from a uniform distribution within [-limit, limit], where limit = sqrt(3 / fan_in) (fan_in is the number of input units in the weight tensor).
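The sampling rule above can be illustrated with a plain-NumPy sketch (an illustration of the formula, not the TensorFlow implementation; for a 2-D dense kernel of shape (fan_in, fan_out), fan_in is the first dimension, while convolutional kernels compute fan_in differently):

```python
import numpy as np

def lecun_uniform_sample(shape, rng=None):
    """Illustrative sketch: draw samples uniformly from [-limit, limit],
    where limit = sqrt(3 / fan_in) and fan_in is the number of input
    units (the first dimension of a 2-D dense kernel)."""
    rng = rng or np.random.default_rng()
    fan_in = shape[0]
    limit = np.sqrt(3.0 / fan_in)
    return rng.uniform(-limit, limit, size=shape)

# Every sampled weight lies within [-limit, limit].
w = lecun_uniform_sample((300, 100))
```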

Examples:

# Standalone usage:
initializer = tf.keras.initializers.LecunUniform()
values = initializer(shape=(2, 2))
# Usage in a Keras layer:
initializer = tf.keras.initializers.LecunUniform()
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Args

seed A Python integer. An initializer created with a given seed will always produce the same random tensor for a given shape and dtype.
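The determinism described above can be checked directly: two initializers constructed with the same seed produce identical tensors for the same shape and dtype.

```python
import tensorflow as tf

# Two initializers constructed with the same seed...
init_a = tf.keras.initializers.LecunUniform(seed=42)
init_b = tf.keras.initializers.LecunUniform(seed=42)

# ...yield the same random tensor for a given shape and dtype.
a = init_a(shape=(2, 2))
b = init_b(shape=(2, 2))
```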

Methods

from_config

Instantiates an initializer from a configuration dictionary.

Example:

initializer = RandomUniform(-1, 1)
config = initializer.get_config()
initializer = RandomUniform.from_config(config)

Args
config A Python dictionary. It will typically be the output of get_config.

Returns
An Initializer instance.

get_config

Returns the configuration of the initializer as a JSON-serializable dict.

Returns
A JSON-serializable Python dict.

__call__

Returns a tensor object initialized as specified by the initializer.

Args
shape Shape of the tensor.
dtype Optional dtype of the tensor. Only floating point types are supported. If not specified, tf.keras.backend.floatx() is used, which defaults to float32 unless you have configured it otherwise (via tf.keras.backend.set_floatx(float_dtype)).
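For example, the dtype can be overridden per call while the default follows tf.keras.backend.floatx() (float32 in a fresh session):

```python
import tensorflow as tf

initializer = tf.keras.initializers.LecunUniform()

# Default dtype comes from tf.keras.backend.floatx().
values_default = initializer(shape=(2, 2))

# A floating-point dtype can be passed explicitly.
values_f64 = initializer(shape=(2, 2), dtype=tf.float64)
```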