tf.contrib.layers.variance_scaling_initializer


Returns an initializer that generates tensors without scaling variance.

When initializing a deep network, it is in principle advantageous to keep the scale of the input variance constant, so it does not explode or diminish by the time it reaches the final layer. This initializer uses the following formula:

  if mode='FAN_IN':    # Count only number of input connections.
    n = fan_in
  elif mode='FAN_OUT': # Count only number of output connections.
    n = fan_out
  elif mode='FAN_AVG': # Average number of input and output connections.
    n = (fan_in + fan_out) / 2.0

  truncated_normal(shape, 0.0, stddev=sqrt(factor / n))
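For concreteness, here is a minimal NumPy sketch of that formula, assuming a 2-D weight matrix where fan_in and fan_out are simply the first and last dimensions of the shape (for convolutional kernels, TensorFlow also folds in the receptive field size). It implements the docstring formula literally; the shipped implementation may additionally correct the stddev to compensate for truncation. The helper name is hypothetical, not part of tf.contrib:

  import numpy as np

  def sample_variance_scaled(shape, factor=2.0, mode='FAN_IN', rng=None):
      # Hypothetical helper: draws one tensor following the formula above.
      rng = rng if rng is not None else np.random.default_rng()
      fan_in, fan_out = shape[0], shape[-1]
      if mode == 'FAN_IN':
          n = fan_in
      elif mode == 'FAN_OUT':
          n = fan_out
      elif mode == 'FAN_AVG':
          n = (fan_in + fan_out) / 2.0
      else:
          raise TypeError("mode must be in ['FAN_IN', 'FAN_OUT', 'FAN_AVG']")
      stddev = np.sqrt(factor / n)
      # truncated_normal: resample anything beyond two standard deviations.
      samples = rng.normal(0.0, stddev, size=shape)
      mask = np.abs(samples) > 2.0 * stddev
      while mask.any():
          samples[mask] = rng.normal(0.0, stddev, size=int(mask.sum()))
          mask = np.abs(samples) > 2.0 * stddev
      return samples

With the defaults factor=2.0 and mode='FAN_IN', this yields stddev = sqrt(2 / fan_in).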

Args:

factor: Float. A multiplicative factor.
mode: String. 'FAN_IN', 'FAN_OUT', 'FAN_AVG'.
uniform: Whether to use uniform or normally distributed random initialization.
seed: A Python integer. Used to create random seeds. See tf.compat.v1.set_random_seed for behavior.
dtype: The data type. Only floating point types are supported.
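A minimal usage sketch under TF 1.x (tf.contrib was removed in TF 2.x); the variable name and shape are illustrative:

  import tensorflow as tf  # TensorFlow 1.x

  init = tf.contrib.layers.variance_scaling_initializer(
      factor=2.0, mode='FAN_IN', uniform=False, seed=42, dtype=tf.float32)

  # Initialize a dense layer's weights so activations keep roughly
  # unit variance; [784, 256] is an arbitrary example shape.
  weights = tf.get_variable('weights', shape=[784, 256], initializer=init)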

Returns:

An initializer that generates tensors with unit variance.

Raises:

ValueError: if dtype is not a floating point type.
TypeError: if mode is not in ['FAN_IN', 'FAN_OUT', 'FAN_AVG'].
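For reference, the argument combinations the contrib docstring associates with well-known initialization schemes can be sketched as follows (the variable names here are illustrative):

  # He initialization (Delving Deep into Rectifiers, He et al.) --
  # this is the default argument combination:
  he_init = tf.contrib.layers.variance_scaling_initializer(
      factor=2.0, mode='FAN_IN', uniform=False)

  # Xavier/Glorot-style initialization (Glorot & Bengio):
  xavier_init = tf.contrib.layers.variance_scaling_initializer(
      factor=1.0, mode='FAN_AVG', uniform=True)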