A conditional accumulator for aggregating gradients.
```
tf.compat.v1.ConditionalAccumulator(
    dtype, shape=None, shared_name=None,
    name='conditional_accumulator', reduction_type='MEAN'
)
```
Only up-to-date gradients are added to the accumulator: a gradient is up to date if the time step at which it was computed equals the accumulator's current time step; stale gradients are silently dropped. Extraction of the aggregated gradient blocks until the required number of gradients has been accumulated.
| Args | Description |
|---|---|
| `dtype` | Datatype of the accumulated gradients. |
| `shape` | Shape of the accumulated gradients. |
| `shared_name` | Optional. If non-empty, this accumulator will be shared under the given name across multiple sessions. |
| `name` | Optional name for the accumulator. |
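A minimal usage sketch, assuming TF1-style graph mode (eager execution disabled): three gradients are applied at the current time step, and `take_grad` blocks until the required count is reached, then returns their mean (with the default `reduction_type='MEAN'`).

```python
import tensorflow as tf

# ConditionalAccumulator is a graph-mode (TF1) construct.
tf.compat.v1.disable_eager_execution()

acc = tf.compat.v1.ConditionalAccumulator(
    dtype=tf.float32, shape=(), reduction_type='MEAN')

# Each apply_grad is tagged with a local time step; gradients whose
# local_step is behind the accumulator's global step are dropped.
apply_ops = [acc.apply_grad(tf.constant(g), local_step=0)
             for g in (1.0, 2.0, 3.0)]

# take_grad blocks until at least num_required gradients have been
# accumulated, then returns the aggregate and resets the accumulator.
mean_grad = acc.take_grad(num_required=3)

with tf.compat.v1.Session() as sess:
    sess.run(apply_ops)
    print(sess.run(mean_grad))  # 2.0
```

Because `take_grad` blocks, the number of `apply_grad` calls issued before extraction must reach `num_required`, or the session call will hang.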