tf.compat.v1.layers.BatchNormalization

Batch Normalization layer from (Ioffe et al., 2015).

Inherits From: BatchNormalization, Layer, Layer, Module

Keras APIs handle BatchNormalization updates to the moving_mean and moving_variance as part of their fit() and evaluate() loops. However, if a custom training loop is used with an instance of Model, these update ops need to be run explicitly. Here's a simple example of how that can be done:

  # model is an instance of Model that contains a BatchNormalization layer.
  update_ops = model.get_updates_for(None) + model.get_updates_for(features)
  train_op = optimizer.minimize(loss)
  # Group the minimize op with the layer's update ops so that
  # moving_mean and moving_variance are refreshed at each training step.
  train_op = tf.group([train_op, update_ops])
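The snippet above can also be sketched end to end. The following is a minimal, self-contained example (not taken from the original docs) that builds a single BatchNormalization layer in TF1 graph mode, groups its update ops with the train op, and runs one training step; it reads the layer's updates via the layer's updates property rather than from a Model.

```python
import numpy as np
import tensorflow as tf

# This sketch assumes TF1-style graph execution.
tf.compat.v1.disable_eager_execution()
tf.compat.v1.reset_default_graph()

features = tf.compat.v1.placeholder(tf.float32, shape=(None, 4))
bn = tf.compat.v1.layers.BatchNormalization()
# training=True makes the layer create moving_mean/moving_variance update ops.
outputs = bn(features, training=True)
loss = tf.reduce_mean(outputs ** 2)

optimizer = tf.compat.v1.train.GradientDescentOptimizer(0.1)
train_op = optimizer.minimize(loss)
# Without this grouping, minimize() alone would never run the moving
# statistics updates.
train_op = tf.group([train_op] + bn.updates)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    data = np.random.RandomState(0).normal(size=(8, 4)).astype(np.float32)
    sess.run(train_op, feed_dict={features: data})
    # moving_mean starts at zeros; after one grouped step it has moved
    # toward the batch mean.
    moving_mean = sess.run(bn.moving_mean)
```

If the grouped update ops were omitted, moving_mean would remain at its zero initialization no matter how many training steps were run.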

axis: An int or list of int, the axis or axes that should be normalized, typically the features axis/axes. For instance, after a Conv2D layer with