ResourceApplyAdagrad

public final class ResourceApplyAdagrad

Update '*var' according to the adagrad scheme.

accum += grad * grad
var -= lr * grad * (1 / (sqrt(accum) + epsilon))
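
To make the arithmetic concrete, here is a scalar walk-through of one update step in plain Java (illustrative values only; this is not the TF op itself):

    public class AdagradStep {
        public static void main(String[] args) {
            float lr = 0.1f, epsilon = 1e-7f;
            float var = 1.0f, accum = 0.0f, grad = 0.5f;

            accum += grad * grad; // accum = 0.25
            var -= lr * grad * (1f / ((float) Math.sqrt(accum) + epsilon));
            // sqrt(0.25) = 0.5, so the step is 0.1 * 0.5 * (1 / 0.5) = 0.1
            System.out.println(var); // prints ~0.9
        }
    }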

Nested Classes

class ResourceApplyAdagrad.Options Optional attributes for ResourceApplyAdagrad  

Constants

String OP_NAME The name of this op, as known by the TensorFlow core engine

Public Methods

static <T extends TType> ResourceApplyAdagrad
create(Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> epsilon, Operand<T> grad, Options... options)
Factory method to create a class wrapping a new ResourceApplyAdagrad operation.
static ResourceApplyAdagrad.Options
updateSlots(Boolean updateSlots)
static ResourceApplyAdagrad.Options
useLocking(Boolean useLocking)

Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine

Constant Value: "ResourceApplyAdagradV2" (this class binds to the V2 kernel, which adds the epsilon input)

Public Methods

public static <T extends TType> ResourceApplyAdagrad create (Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> epsilon, Operand<T> grad, Options... options)

Factory method to create a class wrapping a new ResourceApplyAdagrad operation.

Parameters
scope current scope
var Should be from a Variable().
accum Should be from a Variable().
lr Scaling factor. Must be a scalar.
epsilon Constant factor. Must be a scalar.
grad The gradient.
options carries optional attribute values
Returns
  • a new instance of ResourceApplyAdagrad
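
As a usage illustration, here is a minimal sketch assuming the org.tensorflow Java bindings; the helper name applyAdagrad and the hyperparameter values are made up, and var and accum are assumed to be resource handles created elsewhere (e.g. via VarHandleOp):

    import org.tensorflow.Operand;
    import org.tensorflow.op.Ops;
    import org.tensorflow.op.train.ResourceApplyAdagrad;
    import org.tensorflow.types.TFloat32;

    public class AdagradExample {
        // Wires a ResourceApplyAdagrad node into the graph held by `tf`.
        static ResourceApplyAdagrad applyAdagrad(
                Ops tf, Operand<?> var, Operand<?> accum, Operand<TFloat32> grad) {
            return ResourceApplyAdagrad.create(
                tf.scope(),
                var,                 // variable to update
                accum,               // Adagrad accumulator slot
                tf.constant(0.01f),  // lr: scalar learning rate
                tf.constant(1e-7f),  // epsilon: scalar constant factor
                grad,                // the gradient
                ResourceApplyAdagrad.useLocking(true)); // optional attribute
        }
    }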

public static ResourceApplyAdagrad.Options updateSlots (Boolean updateSlots)

public static ResourceApplyAdagrad.Options useLocking (Boolean useLocking)

Parameters
useLocking If `True`, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
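
Continuing the sketch above (with lr, epsilon, and grad standing for the operands shown earlier), both optional attributes are passed as trailing arguments to create; updateSlots carries no description here, so the comment below is an assumption based on the attribute name:

    ResourceApplyAdagrad.create(tf.scope(), var, accum, lr, epsilon, grad,
        ResourceApplyAdagrad.useLocking(true),    // serialize concurrent updates
        ResourceApplyAdagrad.updateSlots(false)); // presumably skips updating accum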