Returns an initializer performing "Xavier" initialization for weights.
tf.contrib.layers.xavier_initializer(
uniform=True, seed=None, dtype=tf.dtypes.float32
)
This function implements the weight initialization from:
Xavier Glorot and Yoshua Bengio (2010): Understanding the difficulty of training deep feedforward neural networks. International conference on artificial intelligence and statistics.
This initializer is designed to keep the scale of the gradients roughly the
same in all layers. With a uniform distribution, this corresponds to sampling
from the range [-x, x] where x = sqrt(6. / (in + out)); with a normal
distribution, a standard deviation of sqrt(2. / (in + out)) is used.
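As an illustration, a minimal NumPy re-implementation of these two sampling rules (a sketch of the math above, not the `tf.contrib` API itself; the function name and `fan_in`/`fan_out` parameters are hypothetical):

```python
import numpy as np

def xavier_sketch(fan_in, fan_out, uniform=True, rng=None):
    """Sample a (fan_in, fan_out) weight matrix using Glorot/Xavier scaling.

    Illustrative only -- not the tf.contrib.layers implementation.
    """
    rng = np.random.default_rng() if rng is None else rng
    if uniform:
        # Uniform variant: sample from [-x, x] with x = sqrt(6 / (in + out)).
        limit = np.sqrt(6.0 / (fan_in + fan_out))
        return rng.uniform(-limit, limit, size=(fan_in, fan_out))
    # Normal variant: zero mean, stddev = sqrt(2 / (in + out)).
    stddev = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, stddev, size=(fan_in, fan_out))

w = xavier_sketch(256, 128)
```

Both variants give the weights a variance of roughly 2 / (in + out), which is what keeps the gradient scale comparable across layers.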
| Args | |
| --- | --- |
| `uniform` | Whether to use uniform or normally distributed random initialization. |
| `seed` | A Python integer. Used to create random seeds. See `tf.compat.v1.set_random_seed` for behavior. |
| `dtype` | The data type. Only floating point types are supported. |
| Returns |
| --- |
| An initializer for a weight matrix. |