tf.contrib.opt.clip_gradients_by_global_norm

tf.contrib.opt.clip_gradients_by_global_norm(
    gradients_variables,
    clip_norm=20.0
)

Clips gradients of a multitask loss by their global norm.

Ignores all-zero tensors when computing the global norm.

Args:
  gradients_variables: A list of (gradient, variable) pairs.
  clip_norm: A float Tensor, the global norm to clip on. Default is 20.0.

Returns:
  list: A list of (gradient, variable) pairs of the same type as
    gradients_variables, with the gradients clipped by the global norm.
  fixed_global_norm: A 0-D (scalar) Tensor representing the global norm.
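The clipping logic can be sketched in plain NumPy (an illustration of the documented behavior, not the tf.contrib implementation itself; the function name and the choice to return the raw computed norm are assumptions for this sketch):

```python
import numpy as np

def clip_gradients_by_global_norm_np(gradients_variables, clip_norm=20.0):
    """NumPy sketch of global-norm clipping (illustration only).

    All-zero gradient tensors are ignored when computing the global
    norm, mirroring the behavior documented above.
    """
    # Global norm over the non-zero gradient tensors only.
    nonzero = [g for g, _ in gradients_variables if np.any(g)]
    global_norm = np.sqrt(sum(np.sum(np.square(g)) for g in nonzero))
    # Shrink all gradients only when the global norm exceeds clip_norm;
    # otherwise leave them unchanged (scale == 1).
    scale = clip_norm / max(global_norm, clip_norm)
    clipped = [(g * scale, v) for g, v in gradients_variables]
    return clipped, global_norm

# Example: two non-zero gradients plus one all-zero gradient.
gv = [(np.array([3.0, 4.0]), "w1"),
      (np.zeros(2), "w2"),
      (np.array([0.0, 12.0]), "w3")]
clipped, norm = clip_gradients_by_global_norm_np(gv, clip_norm=6.5)
# norm = sqrt(25 + 144) = 13.0, so every gradient is scaled by 6.5/13 = 0.5
```

In TF 1.x this function wrapped the same idea for multitask training, where some tasks produce all-zero gradients on a given step and should not dilute the norm.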