Update `*var` according to the Adagrad scheme.
```python
tf.raw_ops.ApplyAdagradV2(
    var,
    accum,
    lr,
    epsilon,
    grad,
    use_locking=False,
    update_slots=True,
    name=None
)
```
```
accum += grad * grad
var -= lr * grad * (1 / (sqrt(accum) + epsilon))
```
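The update rule above can be sketched in plain Python to show the element-wise arithmetic. This is a hypothetical illustration of the math only, not the TensorFlow op itself (the real op mutates `var` and `accum` in place on the device; `apply_adagrad_v2` here is an invented helper name):

```python
import math

def apply_adagrad_v2(var, accum, lr, epsilon, grad):
    """Pure-Python sketch of the Adagrad V2 update, element-wise over lists."""
    # accum += grad * grad
    new_accum = [a + g * g for a, g in zip(accum, grad)]
    # var -= lr * grad * (1 / (sqrt(accum) + epsilon))
    new_var = [v - lr * g / (math.sqrt(a) + epsilon)
               for v, a, g in zip(var, new_accum, grad)]
    return new_var, new_accum
```

For example, with `var=[1.0]`, `accum=[0.0]`, `lr=0.1`, `epsilon=1e-7`, and `grad=[2.0]`, the accumulator becomes `4.0` and the variable moves to roughly `0.9`. Note that `epsilon` is added outside the square root, which is what distinguishes this op from the original `ApplyAdagrad`.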
Returns
---
A mutable `Tensor`. Has the same type as `var`.