ResourceApplyAdamWithAmsgrad

public final class ResourceApplyAdamWithAmsgrad

Update '*var' according to the AMSGrad variant of the Adam algorithm.

$$\text{lr}_t := \text{learning\_rate} * \sqrt{1 - \beta_2^t} / (1 - \beta_1^t)$$
$$m_t := \beta_1 * m_{t-1} + (1 - \beta_1) * g$$
$$v_t := \beta_2 * v_{t-1} + (1 - \beta_2) * g * g$$
$$\hat{v}_t := \max(\hat{v}_{t-1}, v_t)$$
$$\text{variable} := \text{variable} - \text{lr}_t * m_t / (\sqrt{\hat{v}_t} + \epsilon)$$
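
To make the arithmetic concrete, the following plain-Java sketch performs a single update step on one scalar parameter, directly mirroring the formulas above; the op applies the same arithmetic element-wise across whole tensors. All names and values here are illustrative, not part of this API.

public final class AmsgradStep {
  public static void main(String[] args) {
    double variable = 1.0;                // current parameter value
    double m = 0.0, v = 0.0, vhat = 0.0;  // moment estimates and their running max
    double beta1 = 0.9, beta2 = 0.999;    // momentum factors
    double epsilon = 1e-7;                // ridge term
    double learningRate = 0.001;          // scaling factor
    double g = 0.5;                       // incoming gradient
    int t = 1;                            // 1-based step count

    // lr_t := learning_rate * sqrt(1 - beta2^t) / (1 - beta1^t)
    double lrT = learningRate * Math.sqrt(1 - Math.pow(beta2, t)) / (1 - Math.pow(beta1, t));
    // m_t := beta1 * m_{t-1} + (1 - beta1) * g
    m = beta1 * m + (1 - beta1) * g;
    // v_t := beta2 * v_{t-1} + (1 - beta2) * g * g
    v = beta2 * v + (1 - beta2) * g * g;
    // vhat_t := max(vhat_{t-1}, v_t)
    vhat = Math.max(vhat, v);
    // variable := variable - lr_t * m_t / (sqrt(vhat_t) + epsilon)
    variable -= lrT * m / (Math.sqrt(vhat) + epsilon);

    System.out.printf("updated variable = %.6f%n", variable); // ~0.999 after the first step
  }
}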

Nested Classes

class ResourceApplyAdamWithAmsgrad.Options: Optional attributes for ResourceApplyAdamWithAmsgrad

Constants

String OP_NAME: The name of this op, as known by the TensorFlow core engine.

Public Methods

static <T extends TType> ResourceApplyAdamWithAmsgrad create(Scope scope, Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options)
Factory method to create a class wrapping a new ResourceApplyAdamWithAmsgrad operation.

static ResourceApplyAdamWithAmsgrad.Options useLocking(Boolean useLocking)
Sets the useLocking option.


Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine.

Constant Value: "ResourceApplyAdamWithAmsgrad"

Public Methods

public static <T extends TType> ResourceApplyAdamWithAmsgrad create(Scope scope, Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options)

Factory method to create a class wrapping a new ResourceApplyAdamWithAmsgrad operation.

Parameters
scope: current scope
var: Should be from a Variable().
m: Should be from a Variable().
v: Should be from a Variable().
vhat: Should be from a Variable().
beta1Power: Must be a scalar.
beta2Power: Must be a scalar.
lr: Scaling factor. Must be a scalar.
beta1: Momentum factor. Must be a scalar.
beta2: Momentum factor. Must be a scalar.
epsilon: Ridge term. Must be a scalar.
grad: The gradient.
options: carries optional attribute values
Returns
  • a new instance of ResourceApplyAdamWithAmsgrad
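
For illustration, here is a minimal sketch of wiring this op into a graph. It assumes the four resource handles (var, m, v, vhat), the gradient, and the beta-power operands have already been created and initialized elsewhere; the helper name applyStep and the hyperparameter values are hypothetical. Only the create and useLocking calls documented on this page, plus standard Ops accessors (tf.scope(), tf.constant(...)), are used.

import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.op.train.ResourceApplyAdamWithAmsgrad;
import org.tensorflow.types.TFloat32;

public final class AmsgradOpSketch {
  // Hypothetical helper: appends one AMSGrad update to the graph behind `tf`.
  // var, m, v, and vhat must be handles to initialized resource variables;
  // beta1Power and beta2Power should track beta1^t and beta2^t for the current step t.
  static ResourceApplyAdamWithAmsgrad applyStep(
      Ops tf,
      Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat,
      Operand<TFloat32> beta1Power, Operand<TFloat32> beta2Power,
      Operand<TFloat32> grad) {
    return ResourceApplyAdamWithAmsgrad.create(
        tf.scope(),
        var, m, v, vhat,
        beta1Power, beta2Power,
        tf.constant(0.001f),  // lr: scaling factor, a scalar
        tf.constant(0.9f),    // beta1: momentum factor, a scalar
        tf.constant(0.999f),  // beta2: momentum factor, a scalar
        tf.constant(1e-7f),   // epsilon: ridge term, a scalar
        grad,
        ResourceApplyAdamWithAmsgrad.useLocking(true)); // serialize concurrent updates
  }
}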

public static ResourceApplyAdamWithAmsgrad.Options useLocking(Boolean useLocking)

Parameters
useLocking: If `True`, updating of the var, m, and v tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
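
As shown in the sketch above, the resulting Options value is passed as a trailing argument to create(...). Given the undefined behavior without the lock, preferring useLocking(true) is the safer choice whenever several ops may update the same variables concurrently.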