public struct BatchNorm<Scalar> : Layer where Scalar : TensorFlowFloatingPoint

A batch normalization layer.

Normalizes the activations of the previous layer over each batch; that is, it applies a transformation that keeps the mean activation close to 0 and the activation standard deviation close to 1.

Reference: "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" (Ioffe and Szegedy, 2015).
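
For a feature x, the layer computes scale * (x - mean) / sqrt(variance + epsilon) + offset, where the statistics are typically taken over the current batch during training and from the running statistics during inference. The following is a minimal usage sketch; the input shape and values are illustrative, and it assumes Tensor's init(ones:) initializer.

    import TensorFlow

    // A toy batch of 4 examples with 16 features each (illustrative shape).
    let input = Tensor<Float>(ones: [4, 16])

    // One set of normalization parameters per feature, along the last axis.
    let batchNorm = BatchNorm<Float>(featureCount: 16)

    // Applying the layer preserves the input shape.
    let output = batchNorm.call(input)
    print(output.shape)  // [4, 16]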

  • The feature dimension.

    Declaration

    public let axis: Int
  • The momentum for the running mean and running variance.

    Declaration

    public let momentum: Tensor<Scalar>
  • The offset value, also known as beta.

    Declaration

    public var offset: Tensor<Scalar>
  • The scale value, also known as gamma.

    Declaration

    public var scale: Tensor<Scalar>
  • The variance epsilon value.

    Declaration

    public let epsilon: Tensor<Scalar>
  • The running mean.

    Declaration

    public let runningMean: Parameter<Scalar>
  • The running variance.

    Declaration

    public let runningVariance: Parameter<Scalar>
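
    The sketch below inspects these stored properties after construction; it assumes that Parameter exposes its wrapped tensor through a value property, and the feature count is illustrative.

    import TensorFlow

    let layer = BatchNorm<Float>(featureCount: 4)

    // Per-feature affine parameters: gamma (scale) and beta (offset).
    print(layer.scale.shape, layer.offset.shape)

    // Running statistics are stored as `Parameter` references so they can be
    // updated in place during training; `value` is assumed to expose the tensor.
    print(layer.runningMean.value, layer.runningVariance.value)
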
  • Creates a batch normalization layer.

    Declaration

    public init(
        axis: Int,
        momentum: Tensor<Scalar>,
        offset: Tensor<Scalar>,
        scale: Tensor<Scalar>,
        epsilon: Tensor<Scalar>,
        runningMean: Tensor<Scalar>,
        runningVariance: Tensor<Scalar>
    )

    Parameters

    axis

The axis that should be normalized (typically the feature axis).

    momentum

    The momentum for the moving average.

    offset

    The offset to be added to the normalized tensor.

    scale

    The scale to multiply the normalized tensor by.

    epsilon

    A small scalar added to the denominator to improve numerical stability.

    runningMean

    The running mean.

    runningVariance

    The running variance.
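
    A sketch of constructing the layer with this initializer follows; the argument values are purely illustrative and assume Tensor's init(zeros:) and init(ones:) initializers.

    import TensorFlow

    let featureCount = 8

    // Fully explicit construction; the values shown are illustrative only.
    let layer = BatchNorm<Float>(
        axis: -1,
        momentum: Tensor(0.99),
        offset: Tensor(zeros: [featureCount]),
        scale: Tensor(ones: [featureCount]),
        epsilon: Tensor(0.001),
        runningMean: Tensor(zeros: [featureCount]),
        runningVariance: Tensor(ones: [featureCount])
    )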

  • Returns the output obtained from applying the layer to the given input.

    Declaration

    public func call(_ input: Tensor<Scalar>) -> Tensor<Scalar>

    Parameters

    input

    The input to the layer.

    Return Value

    The output, which has the same shape as the input.
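
    For example, a sketch of applying the layer to a rank-4, channels-last batch; the shapes are hypothetical, and higher-rank inputs are assumed to be normalized along the configured feature axis (the last axis by default).

    import TensorFlow

    // A hypothetical batch of 2 RGB images of size 8×8 (channels-last layout).
    let images = Tensor<Float>(ones: [2, 8, 8, 3])

    // One normalization parameter set per channel.
    let layer = BatchNorm<Float>(featureCount: 3)

    // The output has the same shape as the input.
    let normalized = layer.call(images)
    print(normalized.shape)  // [2, 8, 8, 3]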

  • Creates a batch normalization layer.

    Declaration

    public init(featureCount: Int,
                axis: Int = -1,
                momentum: Tensor<Scalar> = Tensor(0.99),
                epsilon: Tensor<Scalar> = Tensor(0.001))

    Parameters

    featureCount

    The number of features.

    axis

The axis that should be normalized (typically the feature axis).

    momentum

    The momentum for the moving average.

    epsilon

    A small scalar added to the denominator to improve numerical stability.
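
    A sketch of this convenience initializer with and without overriding the defaults; the override values are illustrative only.

    import TensorFlow

    // Uses the documented defaults: axis = -1, momentum = 0.99, epsilon = 0.001.
    let defaultLayer = BatchNorm<Float>(featureCount: 32)

    // Overrides the momentum and epsilon while keeping the default axis.
    let customLayer = BatchNorm<Float>(featureCount: 32,
                                       momentum: Tensor(0.9),
                                       epsilon: Tensor(1e-5))

    print(defaultLayer.momentum, customLayer.momentum)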