Embedding

public struct Embedding<Scalar> : Module where Scalar : TensorFlowFloatingPoint

An embedding layer.

Embedding is effectively a lookup table that maps indices from a fixed vocabulary to fixed-size (dense) vector representations, e.g. [[0], [3]] -> [[0.25, 0.1], [0.6, -0.2]].
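As a minimal, self-contained sketch of that lookup (the table values below are illustrative placeholders, not learned weights):

import TensorFlow

// A 4 x 2 embeddings table; rows 0 and 3 hold the values from the example above.
let table = Tensor<Float>(shape: [4, 2],
                          scalars: [0.25,  0.1,
                                     0.0,  0.0,
                                     0.0,  0.0,
                                     0.6, -0.2])
let layer = Embedding<Float>(embeddings: table)

// Each index is replaced by its row of the table, so the [2, 1] index tensor
// below produces a [2, 1, 2] output containing [0.25, 0.1] and [0.6, -0.2].
let indices: Tensor<Int32> = [[0], [3]]
let output = layer(indices)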

  • A learnable lookup table that maps vocabulary indices to their dense vector representations.

    Declaration

    public var embeddings: Tensor<Scalar>
  • Creates an Embedding layer with randomly initialized embeddings of shape (vocabularySize, embeddingSize) so that each vocabulary index is given a vector representation.

    Declaration

    public init(
      vocabularySize: Int,
      embeddingSize: Int,
      embeddingsInitializer: ParameterInitializer<Scalar> = { Tensor(randomUniform: $0) }
    )

    Parameters

    vocabularySize

    The number of distinct indices (words) in the vocabulary. This number should be the largest integer index plus one.

    embeddingSize

    The number of entries in a single embedding vector representation.

    embeddingsInitializer

    Initializer to use for the embedding parameters.
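    For example, a sketch of both the default random-uniform initialization and a swapped-in initializer (the sizes here are arbitrary, and glorotUniform() is assumed to be the parameter-initializer factory provided by swift-apis):

    import TensorFlow

    // 1000 distinct indices, each mapped to a 16-dimensional vector drawn
    // from the default random-uniform initializer.
    let embedding = Embedding<Float>(vocabularySize: 1000, embeddingSize: 16)

    // The initializer closure can be replaced, e.g. with Glorot-uniform values.
    let glorotEmbedding = Embedding<Float>(
      vocabularySize: 1000,
      embeddingSize: 16,
      embeddingsInitializer: glorotUniform()
    )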

  • Creates an Embedding layer from the provided embeddings. Useful for introducing pretrained embeddings into a model.

    Declaration

    public init(embeddings: Tensor<Scalar>)

    Parameters

    embeddings

    The pretrained embeddings table.
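    A hypothetical sketch of wrapping pretrained vectors (the 3 x 2 table below stands in for weights loaded from disk):

    import TensorFlow

    // Stand-in for a pretrained table: 3 vocabulary entries, 2 dimensions each.
    let pretrained: Tensor<Float> = [[0.1, 0.2],
                                     [0.3, 0.4],
                                     [0.5, 0.6]]
    let embedding = Embedding<Float>(embeddings: pretrained)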

  • Returns an output by replacing each index in the input with its corresponding dense vector representation.

    Declaration

    @differentiable(wrt: self)
    public func forward(_ input: Tensor<Int32>) -> Tensor<Scalar>

    Return Value

    The tensor created by replacing input indices with their vector representations.
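    Calling the layer as a function dispatches to this method; a minimal sketch with arbitrary sizes:

    import TensorFlow

    let embedding = Embedding<Float>(vocabularySize: 10, embeddingSize: 4)

    // Each Int32 index is replaced by its 4-dimensional row of the embeddings
    // table, so a [2, 3] index tensor yields a [2, 3, 4] output.
    let indices: Tensor<Int32> = [[0, 1, 2], [3, 4, 5]]
    let output = embedding(indices)
    print(output.shape)  // [2, 3, 4]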