Clips values of multiple tensors by the ratio of the sum of their norms.
tf.clip_by_global_norm(t_list, clip_norm, use_norm=None, name=None)
Given a tuple or list of tensors t_list and a clipping ratio clip_norm, this operation returns a list of clipped tensors and the global norm (global_norm) of all tensors in t_list. If you've already computed the global norm for t_list, you can supply it via use_norm to avoid recomputing it.
To perform the clipping, the values t_list[i] are set to:

t_list[i] * clip_norm / max(global_norm, clip_norm)

where:

global_norm = sqrt(sum([l2norm(t)**2 for t in t_list]))
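The formula above can be sketched in pure Python on nested lists of floats (a minimal illustration of the math, not the TensorFlow implementation):

```python
import math

def clip_by_global_norm(t_list, clip_norm):
    """Sketch of the global-norm clipping formula on plain Python lists."""
    # global_norm = sqrt of the sum of squared L2 norms of every tensor.
    global_norm = math.sqrt(sum(x * x for t in t_list for x in t))
    # Scale by clip_norm / max(global_norm, clip_norm): a no-op when
    # global_norm <= clip_norm, a uniform shrink otherwise.
    scale = clip_norm / max(global_norm, clip_norm)
    return [[x * scale for x in t] for t in t_list], global_norm

# [3, 4] and [12] have L2 norms 5 and 12, so global_norm = sqrt(25 + 144) = 13.
clipped, norm = clip_by_global_norm([[3.0, 4.0], [12.0]], clip_norm=1.0)
```

After clipping, the global norm of the result equals clip_norm (here 1.0), and every tensor is shrunk by the same ratio.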
If clip_norm > global_norm, the entries in t_list remain as they are; otherwise they're all shrunk by the global ratio.

If global_norm == infinity, the entries in t_list are all set to NaN to signal that an error occurred.
Any entries of t_list that are None are ignored.
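For example, a None entry passes through untouched and does not contribute to the global norm:

```python
import tensorflow as tf

# One real gradient and one None placeholder (e.g. for a variable that
# did not participate in the loss).
grads = [tf.constant([3.0, 4.0]), None]
clipped, global_norm = tf.clip_by_global_norm(grads, clip_norm=1.0)
# global_norm comes from [3, 4] alone; clipped[1] is still None.
```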
This is the correct way to perform gradient clipping (Pascanu et al., 2012).
However, it is slower than
clip_by_norm() because all the parameters must be
ready before the clipping operation can be performed.
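A typical use in a training step might look like the following sketch (the variable, loss, and clip_norm value are illustrative):

```python
import tensorflow as tf

v = tf.Variable([3.0, 4.0])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

with tf.GradientTape() as tape:
    loss = 100.0 * tf.reduce_sum(v * v)  # deliberately steep loss

# All gradients must be materialized here before clipping can start,
# which is why this is slower than per-tensor clip_by_norm().
grads = tape.gradient(loss, [v])
clipped, _ = tf.clip_by_global_norm(grads, clip_norm=5.0)
optimizer.apply_gradients(zip(clipped, [v]))
```

Clipping the whole gradient list by one global ratio preserves the direction of the overall update, which per-tensor clipping does not.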