@frozen
public struct BatchNorm<Scalar>: Layer where Scalar: TensorFlowFloatingPoint
A batch normalization layer.
Normalizes the activations of the previous layer at each batch, i.e., it applies a transformation that keeps the mean activation close to 0 and the activation standard deviation close to 1.
Reference: "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" (Ioffe and Szegedy, 2015).
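For intuition, the following is a minimal hand-rolled sketch of the per-feature transformation (the tensor values and epsilon are arbitrary assumptions for illustration, not the layer's internals):

import TensorFlow

let x: Tensor<Float> = [[1, 2], [3, 4], [5, 6]]  // shape [3, 2]: 3 examples, 2 features
let mean = x.mean(alongAxes: 0)                  // per-feature mean, shape [1, 2]
let variance = x.variance(alongAxes: 0)          // per-feature variance, shape [1, 2]
let epsilon: Float = 0.001
let normalized = (x - mean) / sqrt(variance + epsilon)
// The layer additionally applies its learned parameters: normalized * scale + offset.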
-
The feature dimension.
Declaration
@noDerivative public let axis: Int
-
The momentum for the running mean and running variance.
Declaration
@noDerivative public let momentum: Scalar
-
The offset value, also known as beta.
Declaration
public var offset: Tensor<Scalar>
-
The scale value, also known as gamma.
Declaration
public var scale: Tensor<Scalar>
-
The variance epsilon value.
Declaration
@noDerivative public let epsilon: Scalar
-
The running mean.
Declaration
@noDerivative public var runningMean: Parameter<Scalar>
-
The running variance.
Declaration
@noDerivative public var runningVariance: Parameter<Scalar>
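The running mean and running variance are exponential moving averages controlled by momentum. A minimal sketch of the usual update rule applied after each training batch (illustrative; not the library's exact implementation):

// Moves the running value a fraction (1 - momentum) toward the batch statistic.
func movingAverage(running: Float, batchStatistic: Float, momentum: Float) -> Float {
    momentum * running + (1 - momentum) * batchStatistic
}
// With momentum = 0.99, each batch shifts the running statistic by 1%.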
-
Creates a batch normalization layer.
Declaration
public init(axis: Int, momentum: Scalar, offset: Tensor<Scalar>, scale: Tensor<Scalar>, epsilon: Scalar, runningMean: Tensor<Scalar>, runningVariance: Tensor<Scalar>)
Parameters
axis
The axis that should not be normalized (typically the feature axis).
momentum
The momentum for the moving average.
offset
The offset to be added to the normalized tensor.
scale
The scale to multiply the normalized tensor by.
epsilon
A small scalar added to the denominator to improve numerical stability.
runningMean
The running mean.
runningVariance
The running variance.
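A construction sketch using this initializer (assuming the declaration above; the feature count and parameter values are arbitrary choices for illustration):

import TensorFlow

let featureCount = 4
let layer = BatchNorm<Float>(
    axis: -1,
    momentum: 0.99,
    offset: Tensor(zeros: [featureCount]),        // beta, initialized to 0
    scale: Tensor(ones: [featureCount]),          // gamma, initialized to 1
    epsilon: 0.001,
    runningMean: Tensor(zeros: [featureCount]),
    runningVariance: Tensor(ones: [featureCount]))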
-
Creates a batch normalization layer.
Declaration
public init(featureCount: Int, axis: Int = -1, momentum: Scalar = 0.99, epsilon: Scalar = 0.001)
Parameters
featureCount
The number of features.
axis
The axis that should not be normalized (typically the feature axis).
momentum
The momentum for the moving average.
epsilon
A small scalar added to the denominator to improve numerical stability.
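A minimal usage sketch for this convenience initializer (the input shape is an arbitrary assumption):

import TensorFlow

let layer = BatchNorm<Float>(featureCount: 8)
let input = Tensor<Float>(randomNormal: [16, 8])  // [batch, features]

// Batch statistics are used (and the running statistics updated) in training;
// the stored running statistics are used in inference.
Context.local.learningPhase = .training
let trainingOutput = layer(input)   // shape [16, 8]

Context.local.learningPhase = .inference
let inferenceOutput = layer(input)  // shape [16, 8]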