Update '*var' according to the adagrad scheme.
```python
tf.raw_ops.ApplyAdagrad(
    var, accum, lr, grad, use_locking=False, update_slots=True, name=None
)
```
```
accum += grad * grad
var -= lr * grad * (1 / sqrt(accum))
```
Returns: A mutable `Tensor`. Has the same type as `var`.
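The update rule above can be sketched in plain Python. This is a minimal illustration of the documented math, not the TensorFlow op itself (`tf.raw_ops.ApplyAdagrad` operates in place on mutable ref tensors and is normally invoked through an optimizer rather than called directly); the function name and list-based representation here are illustrative assumptions.

```python
import math

def apply_adagrad(var, accum, lr, grad, update_slots=True):
    # Illustrative sketch of the documented update, element-wise:
    #   accum += grad * grad          (skipped when update_slots=False)
    #   var   -= lr * grad / sqrt(accum)
    # Operates in place on Python lists, mirroring the op's mutation of
    # 'var' and 'accum'.
    for i in range(len(var)):
        if update_slots:
            accum[i] += grad[i] * grad[i]
        var[i] -= lr * grad[i] / math.sqrt(accum[i])
    return var, accum

var = [1.0, 2.0]
accum = [0.1, 0.1]
apply_adagrad(var, accum, lr=0.1, grad=[0.5, 0.5])
```

Because `accum` only ever grows, the effective step size `lr / sqrt(accum)` shrinks over time for coordinates that have seen large gradients, which is the core idea of Adagrad.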