tf.compat.v1.estimator.tpu.experimental.EmbeddingConfigSpec

Class to keep track of the specification for TPU embeddings.

Pass this class to tf.estimator.tpu.TPUEstimator via the embedding_config_spec parameter. At minimum you need to specify feature_columns and optimization_parameters. The feature columns passed should be created with some combination of tf.tpu.experimental.embedding_column and tf.tpu.experimental.shared_embedding_columns.

TPU embeddings do not support arbitrary TensorFlow optimizers, and the main optimizer you use for your model will be ignored for the embedding table variables. Instead, TPU embeddings support a fixed set of predefined optimizers that you can select from and configure. These include Adagrad, Adam, and stochastic gradient descent. Each supported optimizer has a corresponding Parameters class in the tf.tpu.experimental namespace.
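
For example, a minimal sketch of selecting an embedding optimizer (the learning rates shown are illustrative, not recommended values):

# Pick exactly one of the predefined embedding optimizers.
optimization_parameters = tf.tpu.experimental.AdagradParameters(
    learning_rate=0.1)
# Alternatives:
# optimization_parameters = tf.tpu.experimental.AdamParameters(
#     learning_rate=0.001)
# optimization_parameters = (
#     tf.tpu.experimental.StochasticGradientDescentParameters(
#         learning_rate=0.01))

The full example below wires the feature columns and optimization parameters into a TPUEstimator.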

column_a = tf.feature_column.categorical_column_with_identity(...)
column_b = tf.feature_column.categorical_column_with_identity(...)
column_c = tf.feature_column.categorical_column_with_identity(...)
# column_a and column_b share a single 10-dimensional embedding table.
tpu_shared_columns = tf.tpu.experimental.shared_embedding_columns(
    [column_a, column_b], 10)
# column_c gets its own 10-dimensional embedding table.
tpu_non_shared_column = tf.tpu.experimental.embedding_column(
    column_c, 10)
tpu_columns = [tpu_non_shared_column] + tpu_shared_columns
...
def model_fn(features):
  dense_features = tf.keras.layers.DenseFeatures(tpu_columns)
  embedded_feature = dense_features(features)
  ...

estimator = tf.estimator.tpu.TPUEstimator(
    model_fn=model_fn,
    ...
    embedding_config_spec=tf.estimator.tpu.experimental.EmbeddingConfigSpec(
        feature_columns=tpu_columns,
        optimization_parameters=(
            tf.tpu.experimental.AdagradParameters(0.1))))