tf.keras.initializers.GlorotNormal

The Glorot normal initializer, also called Xavier normal initializer.

Inherits From: VarianceScaling, Initializer

Also available via the shortcut function tf.keras.initializers.glorot_normal.

Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / (fan_in + fan_out)) where fan_in is the number of input units in the weight tensor and fan_out is the number of output units in the weight tensor.

Examples:

# Standalone usage:
initializer = tf.keras.initializers.GlorotNormal()
values = initializer(shape=(2, 2))
# Usage in a Keras layer:
initializer = tf.keras.initializers.GlorotNormal()
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
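
As a quick check on the stddev formula above, the following sketch draws one Glorot-normal kernel and compares its sample standard deviation with sqrt(2 / (fan_in + fan_out)); the shape (256, 128) and the seed are arbitrary choices for illustration.

import math
import tensorflow as tf

fan_in, fan_out = 256, 128
initializer = tf.keras.initializers.GlorotNormal(seed=0)
values = initializer(shape=(fan_in, fan_out))

expected = math.sqrt(2.0 / (fan_in + fan_out))  # nominal stddev, about 0.072
observed = float(tf.math.reduce_std(values))    # sample stddev of the drawn kernel
print(expected, observed)                       # the two should be close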

Args
seed A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will not produce the same random values across multiple calls, but multiple initializers will produce the same sequence when constructed with the same seed value.
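
A minimal sketch of the documented seed behavior (exact semantics can vary across TensorFlow versions): two initializers built with the same seed are expected to produce the same sequence of values, while repeated calls to a single seeded initializer draw new values each time.

import tensorflow as tf

a = tf.keras.initializers.GlorotNormal(seed=42)
b = tf.keras.initializers.GlorotNormal(seed=42)

a1 = a(shape=(2, 2))
b1 = b(shape=(2, 2))  # expected to equal a1: same seed, same position in the sequence
a2 = a(shape=(2, 2))  # expected to differ from a1: a second draw from the same initializer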

Methods

from_config

Instantiates an initializer from a configuration dictionary.

Example:

initializer = RandomUniform(-1, 1)
config = initializer.get_config()
initializer = RandomUniform.from_config(config)

Args
config A Python dictionary, the output of get_config.

Returns
A tf.keras.initializers.Initializer instance.

get_config

Returns the configuration of the initializer as a JSON-serializable dict.

Returns
A JSON-serializable Python dict.
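
For illustration, a seeded GlorotNormal instance can be round-tripped through its configuration dict; the exact keys (for example, 'seed') reflect the constructor arguments.

import tensorflow as tf

initializer = tf.keras.initializers.GlorotNormal(seed=7)
config = initializer.get_config()  # e.g. {'seed': 7}
restored = tf.keras.initializers.GlorotNormal.from_config(config)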

__call__

Returns a tensor object initialized as specified by the initializer.

Args
shape Shape of the tensor.
dtype Optional dtype of the tensor. Only floating point types are supported. If not specified, tf.keras.backend.floatx() is used, which defaults to float32 unless you have configured it otherwise (via tf.keras.backend.set_floatx(float_dtype)).
**kwargs Additional keyword arguments.
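
A short usage sketch for __call__ with an explicit floating-point dtype; float16 here is an arbitrary choice for illustration.

import tensorflow as tf

initializer = tf.keras.initializers.GlorotNormal()
values = initializer(shape=(3, 4), dtype=tf.float16)  # 3x4 tensor of Glorot-normal samples in float16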