Update entries in 'var' and 'accum' according to the proximal adagrad scheme.
```python
tf.raw_ops.ResourceSparseApplyAdagradDA(
    var,
    gradient_accumulator,
    gradient_squared_accumulator,
    grad,
    indices,
    lr,
    l1,
    l2,
    global_step,
    use_locking=False,
    name=None
)
```
| Returns |
|---|
| The created Operation. |
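The op applies the AdagradDA (dual averaging) rule only to the rows of `var` selected by `indices`. As a rough illustration of the arithmetic the kernel performs (this is a NumPy sketch with a hypothetical helper name, not TensorFlow's actual implementation; the formula follows the AdagradDA rule `w = lr * sign(-g) * max(|g| - l1*T, 0) / (l2*T*lr + sqrt(gg))`, where `g`/`gg` are the gradient and squared-gradient accumulators and `T` is the global step):

```python
import numpy as np

def sparse_adagrad_da_update(var, grad_accum, grad_sq_accum,
                             grad, indices, lr, l1, l2, global_step):
    """Sketch of a sparse AdagradDA step; updates arrays in place.

    For each row idx in `indices` (paired with a row of `grad`):
      g  += grad_row         # gradient accumulator
      gg += grad_row ** 2    # gradient squared accumulator
      var[idx] = lr * sign(-g) * max(|g| - l1*T, 0) / (l2*T*lr + sqrt(gg))
    """
    T = float(global_step)
    for row, idx in enumerate(indices):
        g_row = grad[row]
        grad_accum[idx] += g_row
        grad_sq_accum[idx] += g_row ** 2
        g = grad_accum[idx]
        # l1 shrinkage: coordinates with small accumulated gradient stay at 0
        numerator = lr * np.sign(-g) * np.maximum(np.abs(g) - l1 * T, 0.0)
        var[idx] = numerator / (l2 * T * lr + np.sqrt(grad_sq_accum[idx]))
    return var

var = np.zeros((3, 2))
grad_accum = np.zeros((3, 2))
grad_sq_accum = np.full((3, 2), 0.1)   # small initial accumulator value
grad = np.array([[1.0, -1.0]])         # one gradient row for index 1
sparse_adagrad_da_update(var, grad_accum, grad_sq_accum, grad,
                         indices=[1], lr=0.1, l1=0.0, l2=0.0, global_step=1)
```

Note that, unlike plain Adagrad, AdagradDA recomputes `var` from the accumulators each step rather than incrementally adjusting it, so rows not listed in `indices` are left untouched.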