tft.scale_to_z_score( x, elementwise=False, name=None, output_dtype=None )
Returns a standardized column with mean 0 and variance 1.
Scaling to z-score subtracts out the mean and divides by the standard deviation. Note that the standard deviation computed here is based on the biased variance (0 delta degrees of freedom), as computed by analyzers.var.
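The transformation above can be sketched in plain NumPy. This is a minimal illustration of the formula (x - mean(x)) / std_dev(x) with the biased variance (ddof=0), not the library's actual implementation; the helper name z_score is hypothetical:

```python
import numpy as np

def z_score(x, elementwise=False):
    # Sketch of z-score scaling with biased standard deviation (ddof=0).
    x = np.asarray(x, dtype=np.float64)
    # elementwise: reduce over the batch axis only, so each element
    # (e.g. each feature column) gets its own mean and variance;
    # otherwise reduce over the whole tensor.
    axis = 0 if elementwise else None
    mean = x.mean(axis=axis)
    std = x.std(axis=axis, ddof=0)  # biased: divides by N, not N - 1
    return (x - mean) / std
```

For example, `z_score([1.0, 2.0, 3.0, 4.0])` yields a vector whose mean is 0 and whose (biased) standard deviation is 1.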
Args:
x: A numeric Tensor or SparseTensor.
elementwise: If true, scales each element of the tensor independently; otherwise uses the mean and variance of the whole tensor.
name: (Optional) A name for this operation.
output_dtype: (Optional) If not None, casts the output tensor to this type.
Returns:
A Tensor or SparseTensor containing the input column scaled to mean 0 and variance 1 (standard deviation 1), given by: (x - mean(x)) / std_dev(x).
If x is floating point, the mean will have the same type as x. If x is integral, the output is cast to tf.float32.
Note that TFLearn generally permits only tf.int64 and tf.float32, so casting this scaler's output may be necessary.
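The dtype behavior described above can be sketched in NumPy as well. This is an assumption-laden illustration, not the library's code: the helper name scale_with_cast is hypothetical, and it only mimics the documented rules (floating-point input keeps its type, integral input is cast to float32, and output_dtype overrides both):

```python
import numpy as np

def scale_with_cast(x, output_dtype=None):
    # Hypothetical sketch of the documented casting rules.
    x = np.asarray(x)
    if output_dtype is not None:
        out_dtype = output_dtype
    elif np.issubdtype(x.dtype, np.floating):
        out_dtype = x.dtype          # floating input keeps its type
    else:
        out_dtype = np.float32       # integral input is cast to float32
    xf = x.astype(np.float64)
    scaled = (xf - xf.mean()) / xf.std(ddof=0)  # biased std, as above
    return scaled.astype(out_dtype)
```

Here `scale_with_cast([1, 2, 3, 4])` comes back as float32, while a float64 input stays float64; passing `output_dtype=np.float64` would force the wider type, analogous to setting output_dtype on the real scaler.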