tfm.nlp.layers.RelativePositionEmbedding

Creates a positional embedding.

This layer computes the position encoding as a mix of sine and cosine functions with geometrically increasing wavelengths, as defined and formulated in "Attention is All You Need", section 3.5 (https://arxiv.org/abs/1706.03762).
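
A minimal sketch of this sine/cosine scheme in TensorFlow (the helper name sinusoidal_position_encoding is illustrative, not part of the API; the layer's actual implementation may differ in detail):

```python
import math

import tensorflow as tf


def sinusoidal_position_encoding(length, hidden_size,
                                 min_timescale=1.0, max_timescale=1.0e4):
  """Sketch of the sine/cosine encoding from "Attention is All You Need"."""
  # Positions 0 .. length-1.
  position = tf.cast(tf.range(length), tf.float32)
  # Half of the channels use sine, the other half cosine.
  num_timescales = hidden_size // 2
  # Wavelengths form a geometric progression between the two timescales.
  log_timescale_increment = (
      math.log(max_timescale / min_timescale) / max(num_timescales - 1, 1))
  inv_timescales = min_timescale * tf.exp(
      tf.cast(tf.range(num_timescales), tf.float32) * -log_timescale_increment)
  # Outer product of positions and inverse timescales: (length, num_timescales).
  scaled_time = position[:, tf.newaxis] * inv_timescales[tf.newaxis, :]
  # Concatenate the sine and cosine halves: (length, hidden_size).
  return tf.concat([tf.sin(scaled_time), tf.cos(scaled_time)], axis=1)
```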

Args
hidden_size Size of the hidden layer.
min_timescale Minimum scale that will be applied at each position.
max_timescale Maximum scale that will be applied at each position.
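
A short construction sketch (this assumes the tensorflow-models package, imported as tfm; the timescale values shown are the common transformer choices, not taken from this page):

```python
import tensorflow_models as tfm

# Construct the layer. The timescales bound the sinusoid wavelengths;
# 1.0 and 1.0e4 are the usual transformer choices (assumed here).
pos_embedding = tfm.nlp.layers.RelativePositionEmbedding(
    hidden_size=64,
    min_timescale=1.0,
    max_timescale=1.0e4)
```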

Methods

call


Implements call() for the layer.

Args
inputs A tensor whose second dimension will be used as the length. If None, the length argument must be specified.
length An optional integer specifying the number of positions. If both inputs and length are specified, length must equal the second dimension of inputs.

Returns
A tensor of shape (length, hidden_size).
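
A sketch of the two calling conventions described above (shapes and values are illustrative):

```python
import tensorflow as tf
import tensorflow_models as tfm

pos_embedding = tfm.nlp.layers.RelativePositionEmbedding(hidden_size=64)

# Length inferred from the second dimension of the input tensor.
from_inputs = pos_embedding(tf.zeros([2, 10, 64]))  # shape (10, 64)

# Or, per the Args above, pass an explicit length and no input tensor.
from_length = pos_embedding(inputs=None, length=10)  # shape (10, 64)
```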