tf.keras.initializers.lecun_normal

tf.keras.initializers.lecun_normal(seed=None)


LeCun normal initializer.

It draws samples from a truncated normal distribution centered on 0 with `stddev = sqrt(1 / fan_in)`, where `fan_in` is the number of input units in the weight tensor.

Arguments:

• seed: A Python integer. Used to seed the random number generator.

Returns:

An initializer.
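The sampling behavior described above can be sketched with NumPy. This is an illustrative approximation, not the library's implementation; `lecun_normal_sketch` is a hypothetical helper, and the truncation here simply redraws values beyond two standard deviations:

```python
import numpy as np

def lecun_normal_sketch(shape, fan_in, seed=None):
    # Hypothetical helper illustrating LeCun normal initialization:
    # a truncated normal centered on 0 with stddev = sqrt(1 / fan_in).
    rng = np.random.default_rng(seed)
    stddev = np.sqrt(1.0 / fan_in)
    samples = rng.normal(0.0, stddev, size=shape)
    # Redraw any sample farther than 2 standard deviations from the mean.
    mask = np.abs(samples) > 2 * stddev
    while mask.any():
        samples[mask] = rng.normal(0.0, stddev, size=int(mask.sum()))
        mask = np.abs(samples) > 2 * stddev
    return samples

# A weight matrix for a layer with 784 inputs and 128 outputs.
weights = lecun_normal_sketch((784, 128), fan_in=784, seed=0)
```

In practice you pass the initializer directly to a layer, e.g. `tf.keras.layers.Dense(64, kernel_initializer=tf.keras.initializers.lecun_normal(seed=42))`.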


References:

• Self-Normalizing Neural Networks
• Efficient Backprop