An embedding layer.

Embedding is effectively a lookup table that maps indices from a fixed vocabulary to fixed-size
(dense) vector representations, e.g. mapping the indices [0, 1] to the vectors
[[0.25, 0.1], [0.6, -0.2]].
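A minimal usage sketch of the lookup behavior described above, assuming the Swift for TensorFlow Embedding layer and the vocabularySize:embeddingSize: initializer covered in this section (the sizes are arbitrary example values):

```swift
import TensorFlow

// Sketch: a vocabulary of 10 tokens, each embedded as a 2-dimensional vector.
// The embeddings table is randomly initialized and learned during training.
let layer = Embedding<Float>(vocabularySize: 10, embeddingSize: 2)

// Applying the layer to integer indices gathers the corresponding rows
// of the (vocabularySize, embeddingSize) embeddings table.
let indices = Tensor<Int32>([0, 3])
let vectors = layer(indices)  // shape: [2, 2]
```

Each input index selects one row of the table, so a batch of n indices yields a tensor of shape (n, embeddingSize).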
A learnable lookup table that maps vocabulary indices to their dense vector representations.
public var embeddings: Tensor<Scalar>
Creates an Embedding layer with randomly initialized embeddings of shape
(vocabularySize, embeddingSize) so that each vocabulary index is given a vector representation.
- Parameters:
  - vocabularySize: The number of distinct indices (words) in the vocabulary. This number should be the largest integer index plus one.
  - embeddingSize: The number of entries in a single embedding vector representation.
  - embeddingsInitializer: Initializer to use for the embedding parameters.
Creates an Embedding layer from the provided embeddings. Useful for introducing pretrained embeddings into a model.
public init(embeddings: Tensor<Scalar>)
- Parameter embeddings: The pretrained embeddings table.
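A sketch of the pretrained-embeddings initializer above; the table values here are made-up placeholders standing in for real pretrained vectors:

```swift
import TensorFlow

// Hypothetical pretrained table: 3 vocabulary entries, 2-dimensional vectors.
let pretrained = Tensor<Float>(
    [[0.25, 0.1],
     [0.6, -0.2],
     [0.0, 1.0]])

// Wrap the table in an Embedding layer; the vectors remain learnable
// and can be fine-tuned along with the rest of the model.
let layer = Embedding(embeddings: pretrained)

// Index 2 now maps to the third row of the table.
let vector = layer(Tensor<Int32>([2]))  // [[0.0, 1.0]]
```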