Layer

public protocol Layer : Module where Self.Input : Differentiable

A neural network layer.

Types that conform to Layer represent functions that map inputs to outputs. They may have an internal state represented by parameters, such as weight tensors.

Layer instances define a differentiable callAsFunction(_:) method for mapping inputs to outputs.
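As a sketch, a custom layer conforms by storing its parameters as properties and implementing the differentiable callAsFunction(_:) (assuming the Swift for TensorFlow TensorFlow module; the MyDense name is illustrative):

```swift
import TensorFlow

// A minimal custom layer: parameters are stored properties,
// and the forward pass is the differentiable callAsFunction(_:).
struct MyDense: Layer {
    var weight: Tensor<Float>
    var bias: Tensor<Float>

    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        // Affine transformation followed by a ReLU nonlinearity.
        relu(matmul(input, weight) + bias)
    }
}
```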

  • callAsFunction(_:)

    Returns the output obtained from applying the layer to the given input.

    Declaration

    @differentiable
    func callAsFunction(_ input: Input) -> Output

    Parameters

    input

    The input to the layer.

    Return Value

    The output.

  • forward(_:)

    Default Implementation

    Declaration

    @differentiable
    func forward(_ input: Input) -> Output
  • inferring(from:)

    Extension method

    Returns the inference output obtained from applying the layer to the given input.

    Declaration

    public func inferring(from input: Input) -> Output

    Parameters

    input

    The input to the layer.

    Return Value

    The inference output.
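
    A hedged usage sketch: inferring(from:) applies the layer with the learning phase set to inference, so layers whose behavior differs between training and inference (e.g. Dropout, BatchNorm) use their inference behavior (assumes the Swift for TensorFlow TensorFlow module):

    ```swift
    import TensorFlow

    let layer = Dense<Float>(inputSize: 4, outputSize: 2)
    let x = Tensor<Float>(randomNormal: [1, 4])

    // Runs the layer in inference mode, regardless of the
    // current global learning phase.
    let y = layer.inferring(from: x)
    ```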

  • Backpropagator

    Extension type alias

    A function that maps a direction vector in the output's tangent space to the corresponding gradients at the layer and at the input.
    Declaration

    public typealias Backpropagator = (_ direction: Output.TangentVector)
      -> (layerGradient: TangentVector, inputGradient: Input.TangentVector)
  • appliedForBackpropagation(to:)

    Extension method

    Returns the inference output and the backpropagation function obtained from applying the layer to the given input.

    Declaration

    public func appliedForBackpropagation(to input: Input)
      -> (output: Output, backpropagator: Backpropagator)

    Parameters

    input

    The input to the layer.

    Return Value

    A tuple containing the output and the backpropagation function. The backpropagation function (a.k.a. backpropagator) takes a direction vector and returns the gradients at the layer and at the input, respectively.
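
    A hedged sketch of driving the backpropagator by hand: seed it with a direction vector in the output's tangent space and receive the layer and input gradients (assumes the Swift for TensorFlow TensorFlow module):

    ```swift
    import TensorFlow

    let dense = Dense<Float>(inputSize: 3, outputSize: 1)
    let input = Tensor<Float>(randomNormal: [1, 3])

    // Forward pass that also captures the backpropagation function.
    let (output, backprop) = dense.appliedForBackpropagation(to: input)

    // Seed the backward pass with a direction vector of ones.
    let (layerGradient, inputGradient) = backprop(Tensor<Float>(ones: output.shape))
    ```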

  • callAsFunction(_:)

    Default Implementation

    Declaration

    @differentiable(wrt: self)
    @differentiable
    public func callAsFunction(_ input: Input) -> Output