
DP subclass of tf.keras.Sequential.

This can be used as a differentially private replacement for tf.keras.Sequential. This class implements DP-SGD using the standard Gaussian mechanism.
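The standard Gaussian mechanism mentioned above can be sketched in isolation: each per-microbatch gradient is clipped to `l2_norm_clip` in L2 norm, and Gaussian noise with standard deviation `noise_multiplier * l2_norm_clip` is added to the aggregate. A minimal pure-Python sketch (the function names here are illustrative, not part of the library's API):

```python
import math
import random

def clip_by_l2_norm(grad, l2_norm_clip):
    """Scale the gradient down so its L2 norm is at most l2_norm_clip."""
    norm = math.sqrt(sum(g * g for g in grad))
    scale = min(1.0, l2_norm_clip / max(norm, 1e-12))
    return [g * scale for g in grad]

def gaussian_mechanism(grads, l2_norm_clip, noise_multiplier, rng=random):
    """Clip each gradient, sum the clipped gradients, and add Gaussian
    noise with std noise_multiplier * l2_norm_clip (the DP-SGD recipe)."""
    clipped = [clip_by_l2_norm(g, l2_norm_clip) for g in grads]
    dim = len(grads[0])
    total = [sum(g[i] for g in clipped) for i in range(dim)]
    sigma = noise_multiplier * l2_norm_clip
    return [t + rng.gauss(0.0, sigma) for t in total]

grads = [[3.0, 4.0], [0.5, 0.0]]  # two per-microbatch gradients
noisy_sum = gaussian_mechanism(grads, l2_norm_clip=1.0, noise_multiplier=0.5)
```

Because each clipped gradient has bounded L2 norm, the noise scale needed for a given privacy guarantee depends only on `l2_norm_clip` and `noise_multiplier`, not on the raw gradient magnitudes.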

When instantiating this class, you need to supply several DP-related arguments followed by the standard arguments for Sequential.


# Create Model instance.
model = DPSequential(l2_norm_clip=1.0, noise_multiplier=0.5, use_xla=True,
                     <standard arguments>)

You should use your DPSequential instance with a standard instance of tf.keras.Optimizer as the optimizer, and a standard reduced loss. You do not need to use a differentially private optimizer.

# Use a standard (non-DP) optimizer.
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

# Use a standard reduced loss.
loss = tf.keras.losses.MeanSquaredError()

model.compile(optimizer=optimizer, loss=loss)
model.fit(train_data, train_labels, epochs=1, batch_size=32)

Args:
  l2_norm_clip: Clipping norm (max L2 norm of per-microbatch gradients).
  noise_multiplier: Ratio of the noise standard deviation to the clipping norm.
  num_microbatches: Number of microbatches.
  use_xla: If True, compiles train_step to XLA.
  *args: These will be passed on to the base class __init__ method.
  **kwargs: These will be passed on to the base class __init__ method.
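How these arguments interact can be sketched without the library: the batch is split into `num_microbatches` groups, the average gradient of each group is clipped to `l2_norm_clip`, and noise scaled by `noise_multiplier * l2_norm_clip` is added before normalizing. This is an illustrative sketch, not the library's implementation:

```python
import math
import random

def dp_batch_gradient(per_example_grads, l2_norm_clip, noise_multiplier,
                      num_microbatches, rng=random):
    """Illustrative DP-SGD aggregation: average within each microbatch,
    clip each microbatch gradient to l2_norm_clip, add Gaussian noise
    with std noise_multiplier * l2_norm_clip, then normalize."""
    batch_size = len(per_example_grads)
    dim = len(per_example_grads[0])
    mb_size = batch_size // num_microbatches
    clipped_sum = [0.0] * dim
    for m in range(num_microbatches):
        chunk = per_example_grads[m * mb_size:(m + 1) * mb_size]
        mean = [sum(g[i] for g in chunk) / mb_size for i in range(dim)]
        norm = math.sqrt(sum(x * x for x in mean))
        scale = min(1.0, l2_norm_clip / max(norm, 1e-12))
        for i in range(dim):
            clipped_sum[i] += mean[i] * scale
    sigma = noise_multiplier * l2_norm_clip
    return [(clipped_sum[i] + rng.gauss(0.0, sigma)) / num_microbatches
            for i in range(dim)]

grads = [[1.0, 0.0], [0.0, 1.0], [2.0, 2.0], [0.0, 0.0]]
# noise_multiplier=0.0 makes the result deterministic for inspection.
dp_grad = dp_batch_gradient(grads, l2_norm_clip=1.0,
                            noise_multiplier=0.0, num_microbatches=2)
```

Fewer microbatches means coarser clipping (and faster training); `num_microbatches` equal to the batch size recovers per-example clipping.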



add(layer): Adds a layer instance on top of the layer stack.

Args:
  layer: A layer instance.

Raises:
  TypeError: If layer is not a layer instance.
  ValueError: If the layer argument does not know its input shape.
  ValueError: If the layer argument has multiple output tensors, or is already connected somewhere else (forbidden in Sequential models).


pop(): Removes the last layer in the model.

Raises:
  TypeError: If there are no layers in the model.