@frozen
public struct LayerNorm<Scalar> : Layer where Scalar : TensorFlowFloatingPoint

A layer that applies layer normalization over a mini-batch of inputs. For each example, activations are normalized using the mean and variance computed along the specified axis, then scaled and shifted by the learned `scale` and `offset` parameters.

Reference: Layer Normalization (Ba et al., 2016).
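The computation can be sketched directly with tensor operations. This is an illustrative reimplementation of the formula, not the layer's internals; the variable names are hypothetical:

```swift
import TensorFlow

// For each example, along the chosen axis:
//   normalized = (input - mean) / sqrt(variance + epsilon)
//   output     = normalized * scale + offset
let input = Tensor<Float>([[1, 2, 3], [4, 5, 6]])
let mean = input.mean(alongAxes: -1)
let variance = input.variance(alongAxes: -1)
let epsilon = Tensor<Float>(0.001)
let normalized = (input - mean) / sqrt(variance + epsilon)
```

Unlike batch normalization, the statistics here are computed per example, so the behavior is the same in training and inference.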

  • The offset value, also known as beta.

    Declaration

    public var offset: Tensor<Scalar>
  • The scale value, also known as gamma.

    Declaration

    public var scale: Tensor<Scalar>
  • The axis along which normalization is performed.

    Declaration

    @noDerivative
    public let axis: Int
  • The small value added to the variance to avoid division by zero.

    Declaration

    @noDerivative
    public let epsilon: Tensor<Scalar>
  • Creates a layer normalization layer.

    Declaration

    public init(
        offset: Tensor<Scalar>,
        scale: Tensor<Scalar>,
        axis: Int,
        epsilon: Tensor<Scalar>
    )
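    This initializer takes the parameter tensors directly. A minimal sketch, assuming a feature dimension of 5 (the shapes and values are illustrative):

    ```swift
    import TensorFlow

    // Offset (beta) starts at zero, scale (gamma) at one; both must
    // match the size of the normalized axis.
    let layer = LayerNorm<Float>(
        offset: Tensor(zeros: [5]),
        scale: Tensor(ones: [5]),
        axis: -1,
        epsilon: Tensor(0.001)
    )
    ```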
  • Creates a layer normalization layer.

    Declaration

    public init(
        featureCount: Int,
        axis: Int,
        epsilon: Tensor<Scalar> = Tensor(0.001)
    )

    Parameters

    featureCount

    The number of features.

    axis

    The axis that should be normalized.

    epsilon

    The small scalar added to the variance to avoid division by zero.
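    This convenience initializer creates the parameter tensors for you: `offset` is initialized to zeros and `scale` to ones, each of size `featureCount`. A brief sketch, assuming a last-axis feature dimension of 5:

    ```swift
    import TensorFlow

    // epsilon defaults to Tensor(0.001) and can be omitted.
    let layerNorm = LayerNorm<Float>(featureCount: 5, axis: -1)
    ```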

  • Returns the output obtained from applying the layer to the given input.

    Declaration

    @differentiable
    public func callAsFunction(_ input: Tensor<Scalar>) -> Tensor<Scalar>

    Parameters

    input

    The input to the layer.

    Return Value

    The normalized output, with the same shape as the input.
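    Because `Layer` conformance makes the layer callable, it is applied like a function. A short usage sketch (input values are illustrative):

    ```swift
    import TensorFlow

    let layer = LayerNorm<Float>(featureCount: 3, axis: -1)
    let input = Tensor<Float>([[1, 2, 3], [4, 5, 6]])
    // Each row is normalized independently along the last axis;
    // the output has the same shape as the input.
    let output = layer(input)
    ```

    The `@differentiable` attribute means gradients can flow through this call during training, including into `offset` and `scale`.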