Looks up embeddings for the given `ids` from a list of tensors.
tf.compat.v1.nn.embedding_lookup( params, ids, partition_strategy='mod', name=None, validate_indices=True, max_norm=None )
This function is used to perform parallel lookups on the list of tensors in
`params`. It is a generalization of `tf.gather`, where `params` is
interpreted as a partitioning of a large embedding tensor. `params` may be
a `PartitionedVariable` as returned by using `tf.compat.v1.get_variable()`
with a partitioner.

If `len(params) > 1`, each element `id` of `ids` is partitioned between
the elements of `params` according to the `partition_strategy`.
In all strategies, if the id space does not evenly divide the number of
partitions, each of the first `(max_id + 1) % len(params)` partitions will
be assigned one more id.
"mod", we assign each id to partition
p = id % len(params). For instance,
13 ids are split across 5 partitions as:
[[0, 5, 10], [1, 6, 11], [2, 7, 12], [3, 8], [4, 9]]
"div", we assign ids to partitions in a
contiguous manner. In this case, 13 ids are split across 5 partitions as:
[[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 10], [11, 12]]
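The two splits above can be reproduced with a few lines of plain Python. The sketch below is illustrative only (it is not how TensorFlow implements the partitioning internally) and uses the same 13 ids and 5 partitions as the examples:

```python
# Plain-Python sketch of the two strategies for 13 ids over 5 partitions.
num_ids, num_partitions = 13, 5

# "mod": id goes to partition id % num_partitions.
mod_split = [[i for i in range(num_ids) if i % num_partitions == p]
             for p in range(num_partitions)]
print(mod_split)  # [[0, 5, 10], [1, 6, 11], [2, 7, 12], [3, 8], [4, 9]]

# "div": contiguous blocks; the first (max_id + 1) % num_partitions
# partitions each receive one extra id.
base, extra = divmod(num_ids, num_partitions)
div_split, start = [], 0
for p in range(num_partitions):
    size = base + (1 if p < extra else 0)
    div_split.append(list(range(start, start + size)))
    start += size
print(div_split)  # [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 10], [11, 12]]
```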
If the input `ids` are ragged tensors, partition variables are not supported and
the `partition_strategy` and the `max_norm` are ignored.
The results of the lookup are concatenated into a dense
tensor. The returned tensor has shape
`shape(ids) + shape(params)[1:]`.
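As a minimal sketch of this shape rule (the 13 x 4 table and the 2 x 2 batch of ids below are made-up values for illustration):

```python
import numpy as np
import tensorflow as tf

# Illustrative table: 13 embeddings of width 4 (values are arbitrary).
params = tf.constant(np.arange(13 * 4, dtype=np.float32).reshape(13, 4))
ids = tf.constant([[0, 3], [7, 12]])  # shape(ids) == (2, 2)

result = tf.compat.v1.nn.embedding_lookup(params, ids)
print(result.shape)  # (2, 2, 4) == shape(ids) + shape(params)[1:]
```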
| Args | |
|---|---|
| `params` | A single tensor representing the complete embedding tensor, or a list of P tensors all of the same shape except for the first dimension, representing sharded embedding tensors. Alternatively, a `PartitionedVariable`, created by partitioning along dimension 0. |
| `partition_strategy` | A string specifying the partitioning strategy, relevant if `len(params) > 1`. Currently `"div"` and `"mod"` are supported. Default is `"mod"`. |
| `name` | A name for the operation (optional). |
| `validate_indices` | DEPRECATED. If this operation is assigned to CPU, values in `indices` are always validated to be within range. If assigned to GPU, out-of-bound indices result in safe but unspecified behavior, which may include raising an error. |
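For a rough end-to-end sketch with sharded `params` (the table values, shard count, and ids below are assumptions for illustration), a list of shards laid out for the default `"mod"` strategy produces the same result as a lookup into the single full table:

```python
import numpy as np
import tensorflow as tf

# Full 13 x 4 table, then split into 3 shards the way the "mod" strategy
# expects: shard p holds the rows for ids with id % 3 == p, in id order.
full = np.arange(13 * 4, dtype=np.float32).reshape(13, 4)
shards = [tf.constant(full[p::3]) for p in range(3)]  # shard sizes 5, 4, 4
ids = tf.constant([2, 5, 11])

sharded = tf.compat.v1.nn.embedding_lookup(shards, ids, partition_strategy="mod")
single = tf.compat.v1.nn.embedding_lookup(tf.constant(full), ids)
print(bool(np.allclose(sharded.numpy(), single.numpy())))  # True
```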