
RetrieveTPUEmbeddingADAMParametersGradAccumDebug

public final class RetrieveTPUEmbeddingADAMParametersGradAccumDebug

Retrieve ADAM embedding parameters with debug support.

An op that retrieves optimization parameters from embedding to host memory. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to retrieve updated parameters before saving a checkpoint.
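The retrieval flow described above can be sketched as follows. This is a minimal, unofficial sketch: it assumes a TPU system whose embedding tables were already configured via a ConfigureTPUEmbeddingHost op, a single-shard setup (numShards = 1, shardId = 0), and a hypothetical table named "my_embedding_table". It is not runnable without TPU hardware and the TensorFlow Java runtime.

```java
import org.tensorflow.Graph;
import org.tensorflow.Output;
import org.tensorflow.op.Ops;
import org.tensorflow.op.tpu.RetrieveTPUEmbeddingADAMParametersGradAccumDebug;
import org.tensorflow.types.TFloat32;

public final class RetrieveAdamParamsSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // Build the retrieval op. At runtime it must be preceded by a
      // ConfigureTPUEmbeddingHost op that set up the embedding tables.
      RetrieveTPUEmbeddingADAMParametersGradAccumDebug retrieve =
          RetrieveTPUEmbeddingADAMParametersGradAccumDebug.create(
              tf.scope(),
              1L, // numShards: single-shard setup (assumption)
              0L, // shardId: the only shard (assumption)
              RetrieveTPUEmbeddingADAMParametersGradAccumDebug.tableName(
                  "my_embedding_table")); // hypothetical table name

      // The four float32 outputs can then be fetched in a session run,
      // e.g. to write the updated ADAM state into a checkpoint.
      Output<TFloat32> parameters = retrieve.parameters();
      Output<TFloat32> momenta = retrieve.momenta();
      Output<TFloat32> velocities = retrieve.velocities();
      Output<TFloat32> gradAccumulators = retrieve.gradientAccumulators();
    }
  }
}
```

Identifying the table by tableName is one option; tableId(Long) or config(String) can be used instead, per the Options methods below.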

Nested Classes

class RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options
  Optional attributes for RetrieveTPUEmbeddingADAMParametersGradAccumDebug

Constants

String OP_NAME
  The name of this op, as known by the TensorFlow core engine

Public Methods

static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options config(String config)

static RetrieveTPUEmbeddingADAMParametersGradAccumDebug create(Scope scope, Long numShards, Long shardId, Options... options)
  Factory method to create a class wrapping a new RetrieveTPUEmbeddingADAMParametersGradAccumDebug operation.

Output<TFloat32> gradientAccumulators()
  Parameter gradient_accumulators updated by the ADAM optimization algorithm.

Output<TFloat32> momenta()
  Parameter momenta updated by the ADAM optimization algorithm.

Output<TFloat32> parameters()
  Parameter parameters updated by the ADAM optimization algorithm.

static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options tableId(Long tableId)

static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options tableName(String tableName)

Output<TFloat32> velocities()
  Parameter velocities updated by the ADAM optimization algorithm.

Inherited Methods

Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine

Constant Value: "RetrieveTPUEmbeddingADAMParametersGradAccumDebug"

Public Methods

public static RetrieveTPUEmbeddingADAMParametersGradAccumDebug create(Scope scope, Long numShards, Long shardId, Options... options)

Factory method to create a class wrapping a new RetrieveTPUEmbeddingADAMParametersGradAccumDebug operation.

Parameters
scope current scope
numShards number of shards into which the embedding tables are divided
shardId identifier of the shard for this operation
options carries optional attribute values
Returns
  • a new instance of RetrieveTPUEmbeddingADAMParametersGradAccumDebug

public Output<TFloat32> gradientAccumulators()

Parameter gradient_accumulators updated by the ADAM optimization algorithm.

public Output<TFloat32> momenta()

Parameter momenta updated by the ADAM optimization algorithm.

public Output<TFloat32> parameters()

Parameter parameters updated by the ADAM optimization algorithm.

public static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options config(String config)

public static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options tableId(Long tableId)

public static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options tableName(String tableName)

public Output<TFloat32> velocities()

Parameter velocities updated by the ADAM optimization algorithm.