Update `*var` according to the Adagrad scheme.
tf.raw_ops.ResourceApplyAdagrad(
var, accum, lr, grad, use_locking=False, update_slots=True, name=None
)
accum += grad * grad
var -= lr * grad * (1 / sqrt(accum))
| Returns |
| --- |
| The created Operation. |
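The update rule above can be sketched in plain Python to show exactly what one Adagrad step does per element. This is an illustrative re-implementation, not the TensorFlow kernel itself: the function name `apply_adagrad` and the list-based tensors are assumptions for the sketch, and it mirrors the default `update_slots=True` behavior where the accumulator is updated before the variable.

```python
import math

def apply_adagrad(var, accum, lr, grad):
    """One Adagrad step (illustrative sketch, not the TF kernel).

    accum += grad * grad
    var   -= lr * grad / sqrt(accum)
    """
    # Accumulate squared gradients (the `accum` slot).
    new_accum = [a + g * g for a, g in zip(accum, grad)]
    # Scale the learning rate per element by 1 / sqrt(accum).
    new_var = [v - lr * g / math.sqrt(a)
               for v, g, a in zip(var, grad, new_accum)]
    return new_var, new_accum
```

For example, with `var=[1.0]`, `accum=[0.0]`, `lr=0.1`, and `grad=[2.0]`, the accumulator becomes `4.0` and the variable becomes `1.0 - 0.1 * 2.0 / 2.0 = 0.9`. Elements with a history of large gradients get a smaller effective step size, which is the core idea of Adagrad.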