public protocol Layer : Module where Self.Input : Differentiable

A neural network layer.

Types that conform to Layer represent functions that map inputs to outputs. They may have an internal state represented by parameters, such as weight tensors.

Layer instances define a differentiable callAsFunction(_:) method for mapping inputs to outputs.
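
For example, a minimal conforming type stores its parameters as properties and implements callAsFunction(_:). The following is a sketch, assuming the TensorFlow module; Scale is a hypothetical layer that multiplies its input by a learned factor.

    import TensorFlow

    // A hypothetical layer that scales its input by a learned factor.
    // Input and Output are inferred as Tensor<Float> from callAsFunction(_:).
    struct Scale: Layer {
        // A stored differentiable property; it becomes part of the layer's
        // synthesized TangentVector.
        var factor: Tensor<Float>

        init(factor: Float) {
            self.factor = Tensor(factor)
        }

        @differentiable
        func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
            input * factor
        }
    }

    // Applying the layer is an ordinary call:
    // Scale(factor: 2)(Tensor<Float>([1, 2, 3])) == Tensor<Float>([2, 4, 6])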

  • callAsFunction(_:)

    Returns the output obtained from applying the layer to the given input.

    Declaration

    @differentiable
    func callAsFunction(_ input: Input) -> Output

    Parameters

    input

    The input to the layer.

    Return Value

    The output.

  • call(_:)

    Extension method

    Returns the output obtained from applying the layer to the given input. Equivalent to callAsFunction(_:).

    Declaration

    @differentiable
    func call(_ input: Input) -> Output

  • inferring(from:)

    Extension method

    Returns the inference output obtained from applying the layer to the given input (see the usage sketch after this list).

    Declaration

    @differentiable
    func inferring(from input: Input) -> Output

    Parameters

    input

    The input to the layer.

    Return Value

    The inference output.

  • Backpropagator

    Extension type alias

    A function that takes a direction vector and returns the gradients at the layer and at the input.

    Declaration

    typealias Backpropagator = (_ direction: Output.TangentVector)
        -> (layerGradient: TangentVector, inputGradient: Input.TangentVector)

  • appliedForBackpropagation(to:)

    Extension method

    Returns the inference output and the backpropagation function obtained from applying the layer to the given input.

    Declaration

    func appliedForBackpropagation(to input: Input)
        -> (output: Output, backpropagator: Backpropagator)

    Parameters

    input

    The input to the layer.

    Return Value

    A tuple containing the output and the backpropagation function. The backpropagation function (a.k.a. backpropagator) takes a direction vector and returns the gradients at the layer and at the input, respectively.
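
As a usage sketch for inferring(from:), assuming the Dense layer from the TensorFlow module:

    import TensorFlow

    let layer = Dense<Float>(inputSize: 4, outputSize: 2)
    let x = Tensor<Float>(randomNormal: [1, 4])

    // inferring(from:) runs the layer under the inference learning phase,
    // so training-only behavior (e.g. dropout) is disabled for this call.
    let prediction = layer.inferring(from: x)  // shape: [1, 2]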
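
Continuing the same sketch, appliedForBackpropagation(to:) pairs the output with a Backpropagator that maps a direction vector to gradients:

    // Apply the layer and obtain the backpropagation function.
    let (output, backpropagator) = layer.appliedForBackpropagation(to: x)

    // Seed the backward pass with a direction vector of the output's shape.
    let direction = Tensor<Float>(ones: output.shape)
    let (layerGradient, inputGradient) = backpropagator(direction)
    // layerGradient: Dense<Float>.TangentVector (gradients for weight and bias)
    // inputGradient: Tensor<Float>, same shape as x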