Creates an _EmbeddingColumn for feeding sparse data into a DNN.
tf.contrib.layers.embedding_column(
    sparse_id_column,
    dimension,
    combiner='mean',
    initializer=None,
    ckpt_to_load_from=None,
    tensor_name_in_ckpt=None,
    max_norm=None,
    trainable=True
)
Args:
sparse_id_column: A _SparseColumn, which is created by, for example, one of the sparse_column_with_* or crossed_column functions. Note that the combiner defined in sparse_id_column is ignored.
dimension: An integer specifying the dimension of the embedding.
combiner: A string specifying how to reduce if there are multiple entries in a single row. Currently "mean", "sqrtn" and "sum" are supported, with "mean" the default. "sqrtn" often achieves good accuracy, in particular with bag-of-words columns. Each of these can be thought of as an example-level normalization on the column:
- "sum": do not normalize
- "mean": do l1 normalization
- "sqrtn": do l2 normalization
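The three reductions above can be sketched in plain Python. This is an illustrative re-implementation, not the TensorFlow code: it combines the embedding rows looked up for one example's multiple sparse ids.

```python
# Sketch (not the TensorFlow implementation) of the three combiner
# modes: reduce several embedding rows for one example into one vector.
import math

def combine(rows, combiner="mean"):
    """rows: list of equal-length embedding vectors for one example."""
    n = len(rows)
    dim = len(rows[0])
    total = [sum(r[d] for r in rows) for d in range(dim)]
    if combiner == "sum":      # do not normalize
        return total
    if combiner == "mean":     # l1 normalization: divide by the row count
        return [t / n for t in total]
    if combiner == "sqrtn":    # l2 normalization: divide by sqrt(row count)
        return [t / math.sqrt(n) for t in total]
    raise ValueError("unsupported combiner: %s" % combiner)

rows = [[1.0, 2.0], [3.0, 4.0]]
print(combine(rows, "sum"))    # [4.0, 6.0]
print(combine(rows, "mean"))   # [2.0, 3.0]
```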
For more information, see tf.embedding_lookup_sparse.
initializer: A variable initializer function to be used in embedding variable initialization. If not specified, defaults to tf.compat.v1.truncated_normal_initializer with mean 0.0 and standard deviation 1/sqrt(sparse_id_column.length).
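To make the default scale concrete, here is a small sketch of the standard-deviation formula; the vocabulary size of 10,000 is a hypothetical stand-in for sparse_id_column.length.

```python
# Sketch of the default initializer's scale: values are drawn from a
# truncated normal with mean 0.0 and stddev 1/sqrt(sparse_id_column.length).
import math

vocab_size = 10_000  # hypothetical sparse_id_column.length
stddev = 1.0 / math.sqrt(vocab_size)
print(stddev)  # 0.01
```

Larger vocabularies thus start with proportionally smaller embedding values.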
ckpt_to_load_from: (Optional). String representing checkpoint name/pattern to restore the column weights. Required if tensor_name_in_ckpt is not None.
tensor_name_in_ckpt: (Optional). Name of the Tensor in the provided checkpoint from which to restore the column weights. Required if ckpt_to_load_from is not None.
max_norm: (Optional). If not None, embedding values are l2-normalized to the value of max_norm.
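The max_norm behavior can be sketched as an l2-norm clip in plain Python (again an illustration, not the TensorFlow code): a row whose l2 norm exceeds max_norm is scaled down so its norm equals max_norm; shorter rows pass through unchanged.

```python
# Sketch of max_norm clipping: scale an embedding row down if its
# l2 norm exceeds max_norm; leave it unchanged otherwise.
import math

def clip_by_l2_norm(vec, max_norm):
    norm = math.sqrt(sum(v * v for v in vec))
    if norm <= max_norm:
        return list(vec)
    scale = max_norm / norm
    return [v * scale for v in vec]

print(clip_by_l2_norm([3.0, 4.0], 2.5))  # norm 5.0 -> [1.5, 2.0]
print(clip_by_l2_norm([1.0, 0.0], 2.5))  # norm 1.0 -> unchanged
```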
trainable: (Optional). Whether the embedding is trainable. Default is True.