Update '*var' according to the adagrad scheme.
```
tf.raw_ops.ResourceApplyAdagradV2(
    var,
    accum,
    lr,
    epsilon,
    grad,
    use_locking=False,
    update_slots=True,
    name=None
)
```
```
accum += grad * grad
var -= lr * grad * (1 / (sqrt(accum) + epsilon))
```
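The update rule can be sketched in plain NumPy (an illustrative re-implementation, not the actual TensorFlow kernel; the function name `apply_adagrad_v2` is hypothetical):

```python
import numpy as np

def apply_adagrad_v2(var, accum, lr, epsilon, grad):
    # Illustrative sketch of the AdagradV2 update, not the TF kernel itself.
    # Accumulate the squared gradient.
    accum += grad * grad
    # Scale the step by the inverse of (sqrt(accum) + epsilon).
    var -= lr * grad * (1.0 / (np.sqrt(accum) + epsilon))
    return var, accum

var = np.array([1.0, 2.0])
accum = np.array([0.1, 0.1])
grad = np.array([0.5, 0.5])
var, accum = apply_adagrad_v2(var, accum, lr=0.1, epsilon=1e-7, grad=grad)
```

Note that, unlike the original Adagrad op, `epsilon` is added *outside* the square root, after `sqrt(accum)`.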
| Returns |
| --- |
| The created Operation. |