Assess privacy risks with the TensorFlow Privacy Report


Overview

In this codelab you'll train a simple image classification model on the CIFAR10 dataset, and then use the "membership inference attack" against this model to assess if the attacker is able to "guess" whether a particular sample was present in the training set. You will use the TF Privacy Report to visualize results from multiple models and model checkpoints.

Setup

import numpy as np
from typing import Tuple
from scipy import special
from sklearn import metrics

import tensorflow as tf

import tensorflow_datasets as tfds

# Set verbosity.
tf.compat.v1.logging.set_verbosity(tf.compat.v1.logging.ERROR)
from sklearn.exceptions import ConvergenceWarning

import warnings
warnings.simplefilter(action="ignore", category=ConvergenceWarning)
warnings.simplefilter(action="ignore", category=FutureWarning)
2022-09-01 09:08:59.812178: E tensorflow/stream_executor/cuda/cuda_blas.cc:2981] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2022-09-01 09:09:00.579545: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvrtc.so.11.1: cannot open shared object file: No such file or directory
2022-09-01 09:09:00.579953: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer_plugin.so.7'; dlerror: libnvrtc.so.11.1: cannot open shared object file: No such file or directory
2022-09-01 09:09:00.579968: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.

Install TensorFlow Privacy.

pip install tensorflow_privacy
from tensorflow_privacy.privacy.privacy_tests.membership_inference_attack import membership_inference_attack as mia
from tensorflow_privacy.privacy.privacy_tests.membership_inference_attack.data_structures import AttackInputData
from tensorflow_privacy.privacy.privacy_tests.membership_inference_attack.data_structures import AttackResultsCollection
from tensorflow_privacy.privacy.privacy_tests.membership_inference_attack.data_structures import AttackType
from tensorflow_privacy.privacy.privacy_tests.membership_inference_attack.data_structures import PrivacyMetric
from tensorflow_privacy.privacy.privacy_tests.membership_inference_attack.data_structures import PrivacyReportMetadata
from tensorflow_privacy.privacy.privacy_tests.membership_inference_attack.data_structures import SlicingSpec
from tensorflow_privacy.privacy.privacy_tests.membership_inference_attack import privacy_report
import tensorflow_privacy

Train two models, with privacy metrics

This section trains a pair of keras.Model classifiers on the CIFAR-10 dataset. During the training process it collects privacy metrics that will be used to generate reports in the next section.

The first step is to define some hyperparameters:

dataset = 'cifar10'
num_classes = 10
activation = 'relu'
num_conv = 3

batch_size=50
epochs_per_report = 2
total_epochs = 50

lr = 0.001

Next, load the dataset. There's nothing privacy-specific in this code.

Loading the dataset.
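The dataset-loading cell is collapsed in this rendered page. A minimal sketch of what it needs to produce is shown below; the variable names (x_train, y_train, y_train_indices, x_test, y_test_indices, y_test, input_shape) are inferred from how they are used later in this tutorial, and the preprocessing shown is an assumption.

train_ds = tfds.as_numpy(
    tfds.load(dataset, split=tfds.Split.TRAIN, batch_size=-1))
test_ds = tfds.as_numpy(
    tfds.load(dataset, split=tfds.Split.TEST, batch_size=-1))

# Scale images to [0, 1] and keep the integer labels for the attack input.
x_train = train_ds['image'].astype('float32') / 255.
y_train_indices = train_ds['label'][:, np.newaxis]
x_test = test_ds['image'].astype('float32') / 255.
y_test_indices = test_ds['label'][:, np.newaxis]

# One-hot labels for training with categorical cross-entropy.
y_train = tf.keras.utils.to_categorical(y_train_indices, num_classes)
y_test = tf.keras.utils.to_categorical(y_test_indices, num_classes)

input_shape = x_train.shape[1:]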

Next define a function to build the models.
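The model-building cell is also collapsed. A sketch of a small_cnn function consistent with how it is called below (it takes input_shape, num_classes, num_conv, and activation, and must return a compiled model) might look like the following; the layer widths and the choice of the Adam optimizer are assumptions.

def small_cnn(input_shape: Tuple[int],
              num_classes: int,
              num_conv: int,
              activation: str = 'relu') -> tf.keras.models.Sequential:
  """Build a small CNN with `num_conv` convolution/pooling blocks."""
  model = tf.keras.models.Sequential()
  model.add(tf.keras.layers.Input(shape=input_shape))

  # Stack `num_conv` convolution + pooling blocks.
  for _ in range(num_conv):
    model.add(tf.keras.layers.Conv2D(32, (3, 3), activation=activation))
    model.add(tf.keras.layers.MaxPooling2D())

  model.add(tf.keras.layers.Flatten())
  model.add(tf.keras.layers.Dense(64, activation=activation))
  # Output raw logits; the callback below applies the softmax itself.
  model.add(tf.keras.layers.Dense(num_classes))

  model.compile(
      optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
      loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True),
      metrics=['accuracy'])
  return model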

Build a two-layer and a three-layer CNN model using that function.

The two models share the same training configuration and differ only in the number of convolutional layers (num_conv=2 vs. num_conv=3), so you can compare how model depth affects the results.

model_2layers = small_cnn(
    input_shape, num_classes, num_conv=2, activation=activation)
model_3layers = small_cnn(
    input_shape, num_classes, num_conv=3, activation=activation)

Define a callback to collect privacy metrics

Next, define a keras.callbacks.Callback to periodically run some privacy attacks against the model and log the results.

The Keras fit method will call the on_epoch_end method after each training epoch. The epoch argument is the (0-based) epoch number.

You could implement this procedure by writing a loop that repeatedly calls Model.fit(..., epochs=epochs_per_report) and then runs the attack code, as sketched below. The callback is used here instead because it gives a clear separation between the training logic and the privacy evaluation logic.
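For illustration only, that loop-based alternative would look roughly like the sketch below; run_privacy_attacks is a hypothetical helper standing in for the attack code that the callback implements.

# Loop-based alternative (sketch, not used in this tutorial).
attack_results = []
for report_index in range(total_epochs // epochs_per_report):
  model_2layers.fit(
      x_train, y_train,
      batch_size=batch_size,
      epochs=epochs_per_report,
      validation_data=(x_test, y_test),
      shuffle=True)
  epoch = (report_index + 1) * epochs_per_report
  # run_privacy_attacks is hypothetical; it would wrap the mia.run_attacks
  # call shown in the callback below.
  attack_results.append(run_privacy_attacks(model_2layers, epoch))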

class PrivacyMetrics(tf.keras.callbacks.Callback):
  def __init__(self, epochs_per_report, model_name):
    self.epochs_per_report = epochs_per_report
    self.model_name = model_name
    self.attack_results = []

  def on_epoch_end(self, epoch, logs=None):
    epoch = epoch+1

    if epoch % self.epochs_per_report != 0:
      return

    print(f'\nRunning privacy report for epoch: {epoch}\n')

    logits_train = self.model.predict(x_train, batch_size=batch_size)
    logits_test = self.model.predict(x_test, batch_size=batch_size)

    prob_train = special.softmax(logits_train, axis=1)
    prob_test = special.softmax(logits_test, axis=1)

    # Add metadata to generate a privacy report.
    privacy_report_metadata = PrivacyReportMetadata(
        # Show the validation accuracy on the plot.
        # It's the value passed as accuracy_train that gets plotted.
        accuracy_train=logs['val_accuracy'],
        accuracy_test=logs['val_accuracy'],
        epoch_num=epoch,
        model_variant_label=self.model_name)

    attack_results = mia.run_attacks(
        AttackInputData(
            labels_train=y_train_indices[:, 0],
            labels_test=y_test_indices[:, 0],
            probs_train=prob_train,
            probs_test=prob_test),
        SlicingSpec(entire_dataset=True, by_class=True),
        attack_types=(AttackType.THRESHOLD_ATTACK,
                      AttackType.LOGISTIC_REGRESSION),
        privacy_report_metadata=privacy_report_metadata)

    self.attack_results.append(attack_results)

Train the models

The next code block trains the two models. The all_reports list is used to collect all the results from all the models' training runs. The individual reports are tagged with the model_name, so there's no confusion about which model generated which report.

all_reports = []
callback = PrivacyMetrics(epochs_per_report, "2 Layers")
history = model_2layers.fit(
      x_train,
      y_train,
      batch_size=batch_size,
      epochs=total_epochs,
      validation_data=(x_test, y_test),
      callbacks=[callback],
      shuffle=True)

all_reports.extend(callback.attack_results)
Epoch 1/50
1000/1000 [==============================] - 10s 4ms/step - loss: 1.5074 - accuracy: 0.4593 - val_loss: 1.3571 - val_accuracy: 0.5163
Epoch 2/50
 992/1000 [============================>.] - ETA: 0s - loss: 1.1817 - accuracy: 0.5834
Running privacy report for epoch: 2

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 1.1818 - accuracy: 0.5832 - val_loss: 1.1325 - val_accuracy: 0.5990
Epoch 3/50
1000/1000 [==============================] - 4s 4ms/step - loss: 1.0626 - accuracy: 0.6288 - val_loss: 1.0435 - val_accuracy: 0.6371
Epoch 4/50
 991/1000 [============================>.] - ETA: 0s - loss: 0.9850 - accuracy: 0.6558
Running privacy report for epoch: 4

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.9851 - accuracy: 0.6555 - val_loss: 1.0298 - val_accuracy: 0.6453
Epoch 5/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.9252 - accuracy: 0.6787 - val_loss: 0.9810 - val_accuracy: 0.6648
Epoch 6/50
 992/1000 [============================>.] - ETA: 0s - loss: 0.8733 - accuracy: 0.6942
Running privacy report for epoch: 6

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 15s 15ms/step - loss: 0.8730 - accuracy: 0.6944 - val_loss: 0.9655 - val_accuracy: 0.6648
Epoch 7/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.8272 - accuracy: 0.7102 - val_loss: 0.9678 - val_accuracy: 0.6633
Epoch 8/50
 992/1000 [============================>.] - ETA: 0s - loss: 0.7900 - accuracy: 0.7221
Running privacy report for epoch: 8

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 15s 15ms/step - loss: 0.7902 - accuracy: 0.7221 - val_loss: 0.9468 - val_accuracy: 0.6778
Epoch 9/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.7549 - accuracy: 0.7366 - val_loss: 0.9706 - val_accuracy: 0.6733
Epoch 10/50
 990/1000 [============================>.] - ETA: 0s - loss: 0.7231 - accuracy: 0.7485
Running privacy report for epoch: 10

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.7229 - accuracy: 0.7487 - val_loss: 0.9114 - val_accuracy: 0.6917
Epoch 11/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.6914 - accuracy: 0.7584 - val_loss: 0.9332 - val_accuracy: 0.6881
Epoch 12/50
 990/1000 [============================>.] - ETA: 0s - loss: 0.6598 - accuracy: 0.7686
Running privacy report for epoch: 12

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.6608 - accuracy: 0.7683 - val_loss: 0.9254 - val_accuracy: 0.6961
Epoch 13/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.6379 - accuracy: 0.7754 - val_loss: 0.9576 - val_accuracy: 0.6885
Epoch 14/50
 991/1000 [============================>.] - ETA: 0s - loss: 0.6126 - accuracy: 0.7854
Running privacy report for epoch: 14

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.6125 - accuracy: 0.7855 - val_loss: 1.0004 - val_accuracy: 0.6767
Epoch 15/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5936 - accuracy: 0.7927 - val_loss: 0.9481 - val_accuracy: 0.6918
Epoch 16/50
 993/1000 [============================>.] - ETA: 0s - loss: 0.5681 - accuracy: 0.7986
Running privacy report for epoch: 16

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.5688 - accuracy: 0.7983 - val_loss: 1.0323 - val_accuracy: 0.6696
Epoch 17/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5518 - accuracy: 0.8046 - val_loss: 1.0260 - val_accuracy: 0.6837
Epoch 18/50
 991/1000 [============================>.] - ETA: 0s - loss: 0.5274 - accuracy: 0.8144
Running privacy report for epoch: 18

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.5285 - accuracy: 0.8142 - val_loss: 1.0618 - val_accuracy: 0.6772
Epoch 19/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5078 - accuracy: 0.8200 - val_loss: 1.0594 - val_accuracy: 0.6842
Epoch 20/50
 990/1000 [============================>.] - ETA: 0s - loss: 0.4911 - accuracy: 0.8268
Running privacy report for epoch: 20

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.4905 - accuracy: 0.8270 - val_loss: 1.0788 - val_accuracy: 0.6845
Epoch 21/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.4757 - accuracy: 0.8318 - val_loss: 1.0525 - val_accuracy: 0.6884
Epoch 22/50
 995/1000 [============================>.] - ETA: 0s - loss: 0.4555 - accuracy: 0.8371
Running privacy report for epoch: 22

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.4555 - accuracy: 0.8372 - val_loss: 1.1266 - val_accuracy: 0.6789
Epoch 23/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.4419 - accuracy: 0.8426 - val_loss: 1.1366 - val_accuracy: 0.6810
Epoch 24/50
 999/1000 [============================>.] - ETA: 0s - loss: 0.4257 - accuracy: 0.8496
Running privacy report for epoch: 24

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.4255 - accuracy: 0.8497 - val_loss: 1.1842 - val_accuracy: 0.6847
Epoch 25/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.4150 - accuracy: 0.8523 - val_loss: 1.1718 - val_accuracy: 0.6742
Epoch 26/50
 990/1000 [============================>.] - ETA: 0s - loss: 0.3990 - accuracy: 0.8575
Running privacy report for epoch: 26

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.3995 - accuracy: 0.8573 - val_loss: 1.2563 - val_accuracy: 0.6737
Epoch 27/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.3846 - accuracy: 0.8633 - val_loss: 1.2504 - val_accuracy: 0.6758
Epoch 28/50
 992/1000 [============================>.] - ETA: 0s - loss: 0.3685 - accuracy: 0.8681
Running privacy report for epoch: 28

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.3691 - accuracy: 0.8677 - val_loss: 1.2698 - val_accuracy: 0.6746
Epoch 29/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.3581 - accuracy: 0.8709 - val_loss: 1.3221 - val_accuracy: 0.6726
Epoch 30/50
 992/1000 [============================>.] - ETA: 0s - loss: 0.3442 - accuracy: 0.8774
Running privacy report for epoch: 30

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.3440 - accuracy: 0.8776 - val_loss: 1.3500 - val_accuracy: 0.6683
Epoch 31/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.3336 - accuracy: 0.8820 - val_loss: 1.3992 - val_accuracy: 0.6707
Epoch 32/50
 993/1000 [============================>.] - ETA: 0s - loss: 0.3202 - accuracy: 0.8853
Running privacy report for epoch: 32

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.3208 - accuracy: 0.8851 - val_loss: 1.4398 - val_accuracy: 0.6706
Epoch 33/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.3097 - accuracy: 0.8880 - val_loss: 1.4843 - val_accuracy: 0.6628
Epoch 34/50
 990/1000 [============================>.] - ETA: 0s - loss: 0.2979 - accuracy: 0.8912
Running privacy report for epoch: 34

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.2988 - accuracy: 0.8907 - val_loss: 1.5285 - val_accuracy: 0.6719
Epoch 35/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.2908 - accuracy: 0.8967 - val_loss: 1.5647 - val_accuracy: 0.6659
Epoch 36/50
 997/1000 [============================>.] - ETA: 0s - loss: 0.2828 - accuracy: 0.8973
Running privacy report for epoch: 36

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.2830 - accuracy: 0.8972 - val_loss: 1.5845 - val_accuracy: 0.6631
Epoch 37/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.2676 - accuracy: 0.9029 - val_loss: 1.6956 - val_accuracy: 0.6680
Epoch 38/50
 993/1000 [============================>.] - ETA: 0s - loss: 0.2659 - accuracy: 0.9029
Running privacy report for epoch: 38

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.2660 - accuracy: 0.9028 - val_loss: 1.6532 - val_accuracy: 0.6626
Epoch 39/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.2500 - accuracy: 0.9092 - val_loss: 1.7144 - val_accuracy: 0.6626
Epoch 40/50
 998/1000 [============================>.] - ETA: 0s - loss: 0.2427 - accuracy: 0.9112
Running privacy report for epoch: 40

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.2428 - accuracy: 0.9112 - val_loss: 1.7461 - val_accuracy: 0.6571
Epoch 41/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.2423 - accuracy: 0.9127 - val_loss: 1.8465 - val_accuracy: 0.6573
Epoch 42/50
 993/1000 [============================>.] - ETA: 0s - loss: 0.2280 - accuracy: 0.9174
Running privacy report for epoch: 42

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.2284 - accuracy: 0.9172 - val_loss: 1.8395 - val_accuracy: 0.6631
Epoch 43/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.2206 - accuracy: 0.9199 - val_loss: 1.9405 - val_accuracy: 0.6530
Epoch 44/50
 986/1000 [============================>.] - ETA: 0s - loss: 0.2154 - accuracy: 0.9216
Running privacy report for epoch: 44

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.2160 - accuracy: 0.9215 - val_loss: 1.9817 - val_accuracy: 0.6572
Epoch 45/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.2124 - accuracy: 0.9232 - val_loss: 2.0441 - val_accuracy: 0.6493
Epoch 46/50
 993/1000 [============================>.] - ETA: 0s - loss: 0.1956 - accuracy: 0.9285
Running privacy report for epoch: 46

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.1955 - accuracy: 0.9286 - val_loss: 2.0261 - val_accuracy: 0.6555
Epoch 47/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.1977 - accuracy: 0.9272 - val_loss: 2.0371 - val_accuracy: 0.6574
Epoch 48/50
 988/1000 [============================>.] - ETA: 0s - loss: 0.1889 - accuracy: 0.9309
Running privacy report for epoch: 48

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.1890 - accuracy: 0.9309 - val_loss: 2.1794 - val_accuracy: 0.6586
Epoch 49/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.1870 - accuracy: 0.9327 - val_loss: 2.2243 - val_accuracy: 0.6550
Epoch 50/50
 991/1000 [============================>.] - ETA: 0s - loss: 0.1866 - accuracy: 0.9315
Running privacy report for epoch: 50

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.1867 - accuracy: 0.9315 - val_loss: 2.1728 - val_accuracy: 0.6549
callback = PrivacyMetrics(epochs_per_report, "3 Layers")
history = model_3layers.fit(
      x_train,
      y_train,
      batch_size=batch_size,
      epochs=total_epochs,
      validation_data=(x_test, y_test),
      callbacks=[callback],
      shuffle=True)

all_reports.extend(callback.attack_results)
Epoch 1/50
1000/1000 [==============================] - 5s 5ms/step - loss: 1.6622 - accuracy: 0.3885 - val_loss: 1.3936 - val_accuracy: 0.4988
Epoch 2/50
 991/1000 [============================>.] - ETA: 0s - loss: 1.3397 - accuracy: 0.5184
Running privacy report for epoch: 2

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 1.3395 - accuracy: 0.5184 - val_loss: 1.2703 - val_accuracy: 0.5474
Epoch 3/50
1000/1000 [==============================] - 4s 4ms/step - loss: 1.2186 - accuracy: 0.5687 - val_loss: 1.1998 - val_accuracy: 0.5741
Epoch 4/50
 995/1000 [============================>.] - ETA: 0s - loss: 1.1386 - accuracy: 0.5991
Running privacy report for epoch: 4

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 1.1388 - accuracy: 0.5990 - val_loss: 1.1223 - val_accuracy: 0.6008
Epoch 5/50
1000/1000 [==============================] - 4s 4ms/step - loss: 1.0791 - accuracy: 0.6207 - val_loss: 1.1642 - val_accuracy: 0.5992
Epoch 6/50
 996/1000 [============================>.] - ETA: 0s - loss: 1.0321 - accuracy: 0.6379
Running privacy report for epoch: 6

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 1.0326 - accuracy: 0.6378 - val_loss: 1.1127 - val_accuracy: 0.6076
Epoch 7/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.9949 - accuracy: 0.6511 - val_loss: 1.0516 - val_accuracy: 0.6301
Epoch 8/50
 987/1000 [============================>.] - ETA: 0s - loss: 0.9630 - accuracy: 0.6624
Running privacy report for epoch: 8

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.9630 - accuracy: 0.6624 - val_loss: 1.0033 - val_accuracy: 0.6523
Epoch 9/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.9312 - accuracy: 0.6753 - val_loss: 0.9949 - val_accuracy: 0.6537
Epoch 10/50
 988/1000 [============================>.] - ETA: 0s - loss: 0.9041 - accuracy: 0.6852
Running privacy report for epoch: 10

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.9059 - accuracy: 0.6847 - val_loss: 1.0499 - val_accuracy: 0.6337
Epoch 11/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.8824 - accuracy: 0.6903 - val_loss: 0.9996 - val_accuracy: 0.6607
Epoch 12/50
 997/1000 [============================>.] - ETA: 0s - loss: 0.8583 - accuracy: 0.6985
Running privacy report for epoch: 12

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.8583 - accuracy: 0.6986 - val_loss: 0.9458 - val_accuracy: 0.6766
Epoch 13/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.8372 - accuracy: 0.7051 - val_loss: 0.9338 - val_accuracy: 0.6774
Epoch 14/50
 987/1000 [============================>.] - ETA: 0s - loss: 0.8220 - accuracy: 0.7095
Running privacy report for epoch: 14

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.8218 - accuracy: 0.7096 - val_loss: 0.9811 - val_accuracy: 0.6636
Epoch 15/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.8092 - accuracy: 0.7135 - val_loss: 0.9274 - val_accuracy: 0.6764
Epoch 16/50
1000/1000 [==============================] - ETA: 0s - loss: 0.7925 - accuracy: 0.7212
Running privacy report for epoch: 16

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.7925 - accuracy: 0.7212 - val_loss: 0.9228 - val_accuracy: 0.6864
Epoch 17/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.7766 - accuracy: 0.7274 - val_loss: 0.9301 - val_accuracy: 0.6828
Epoch 18/50
 995/1000 [============================>.] - ETA: 0s - loss: 0.7609 - accuracy: 0.7326
Running privacy report for epoch: 18

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.7604 - accuracy: 0.7327 - val_loss: 0.9212 - val_accuracy: 0.6893
Epoch 19/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.7478 - accuracy: 0.7367 - val_loss: 0.9407 - val_accuracy: 0.6759
Epoch 20/50
 994/1000 [============================>.] - ETA: 0s - loss: 0.7356 - accuracy: 0.7418
Running privacy report for epoch: 20

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.7353 - accuracy: 0.7419 - val_loss: 0.8988 - val_accuracy: 0.6936
Epoch 21/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.7203 - accuracy: 0.7450 - val_loss: 0.9204 - val_accuracy: 0.6876
Epoch 22/50
 995/1000 [============================>.] - ETA: 0s - loss: 0.7135 - accuracy: 0.7480
Running privacy report for epoch: 22

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.7137 - accuracy: 0.7480 - val_loss: 0.8950 - val_accuracy: 0.6993
Epoch 23/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.7005 - accuracy: 0.7521 - val_loss: 0.9183 - val_accuracy: 0.6878
Epoch 24/50
 996/1000 [============================>.] - ETA: 0s - loss: 0.6930 - accuracy: 0.7545
Running privacy report for epoch: 24

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.6927 - accuracy: 0.7547 - val_loss: 0.9101 - val_accuracy: 0.6933
Epoch 25/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.6783 - accuracy: 0.7614 - val_loss: 0.9272 - val_accuracy: 0.6903
Epoch 26/50
 996/1000 [============================>.] - ETA: 0s - loss: 0.6725 - accuracy: 0.7621
Running privacy report for epoch: 26

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.6726 - accuracy: 0.7620 - val_loss: 0.9101 - val_accuracy: 0.6953
Epoch 27/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.6650 - accuracy: 0.7640 - val_loss: 0.9073 - val_accuracy: 0.6990
Epoch 28/50
 988/1000 [============================>.] - ETA: 0s - loss: 0.6508 - accuracy: 0.7686
Running privacy report for epoch: 28

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.6512 - accuracy: 0.7684 - val_loss: 0.9356 - val_accuracy: 0.6964
Epoch 29/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.6489 - accuracy: 0.7705 - val_loss: 0.9433 - val_accuracy: 0.6871
Epoch 30/50
 998/1000 [============================>.] - ETA: 0s - loss: 0.6402 - accuracy: 0.7746
Running privacy report for epoch: 30

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.6406 - accuracy: 0.7744 - val_loss: 0.9736 - val_accuracy: 0.6830
Epoch 31/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.6303 - accuracy: 0.7764 - val_loss: 0.9283 - val_accuracy: 0.6970
Epoch 32/50
1000/1000 [==============================] - ETA: 0s - loss: 0.6295 - accuracy: 0.7769
Running privacy report for epoch: 32

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.6295 - accuracy: 0.7769 - val_loss: 0.9491 - val_accuracy: 0.6937
Epoch 33/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.6176 - accuracy: 0.7816 - val_loss: 0.9748 - val_accuracy: 0.6814
Epoch 34/50
 998/1000 [============================>.] - ETA: 0s - loss: 0.6109 - accuracy: 0.7819
Running privacy report for epoch: 34

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.6109 - accuracy: 0.7819 - val_loss: 0.9126 - val_accuracy: 0.7043
Epoch 35/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.6044 - accuracy: 0.7853 - val_loss: 0.9606 - val_accuracy: 0.6882
Epoch 36/50
1000/1000 [==============================] - ETA: 0s - loss: 0.5973 - accuracy: 0.7872
Running privacy report for epoch: 36

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 17s 17ms/step - loss: 0.5973 - accuracy: 0.7872 - val_loss: 0.9387 - val_accuracy: 0.6986
Epoch 37/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5929 - accuracy: 0.7891 - val_loss: 0.9987 - val_accuracy: 0.6796
Epoch 38/50
 992/1000 [============================>.] - ETA: 0s - loss: 0.5906 - accuracy: 0.7888
Running privacy report for epoch: 38

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 16s 16ms/step - loss: 0.5905 - accuracy: 0.7888 - val_loss: 0.9477 - val_accuracy: 0.6983
Epoch 39/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5830 - accuracy: 0.7926 - val_loss: 0.9519 - val_accuracy: 0.6956
Epoch 40/50
 998/1000 [============================>.] - ETA: 0s - loss: 0.5709 - accuracy: 0.7960
Running privacy report for epoch: 40

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 17s 17ms/step - loss: 0.5708 - accuracy: 0.7961 - val_loss: 0.9457 - val_accuracy: 0.7011
Epoch 41/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5739 - accuracy: 0.7957 - val_loss: 0.9750 - val_accuracy: 0.6920
Epoch 42/50
 988/1000 [============================>.] - ETA: 0s - loss: 0.5659 - accuracy: 0.7978
Running privacy report for epoch: 42

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 17s 17ms/step - loss: 0.5657 - accuracy: 0.7979 - val_loss: 0.9787 - val_accuracy: 0.6962
Epoch 43/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5675 - accuracy: 0.7983 - val_loss: 0.9364 - val_accuracy: 0.7019
Epoch 44/50
 999/1000 [============================>.] - ETA: 0s - loss: 0.5586 - accuracy: 0.7995
Running privacy report for epoch: 44

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 17s 17ms/step - loss: 0.5587 - accuracy: 0.7994 - val_loss: 0.9897 - val_accuracy: 0.6904
Epoch 45/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5460 - accuracy: 0.8050 - val_loss: 0.9731 - val_accuracy: 0.7007
Epoch 46/50
 995/1000 [============================>.] - ETA: 0s - loss: 0.5481 - accuracy: 0.8040
Running privacy report for epoch: 46

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 17s 17ms/step - loss: 0.5485 - accuracy: 0.8039 - val_loss: 0.9999 - val_accuracy: 0.6927
Epoch 47/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5434 - accuracy: 0.8063 - val_loss: 0.9846 - val_accuracy: 0.6938
Epoch 48/50
 995/1000 [============================>.] - ETA: 0s - loss: 0.5334 - accuracy: 0.8093
Running privacy report for epoch: 48

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 17s 17ms/step - loss: 0.5335 - accuracy: 0.8091 - val_loss: 1.0544 - val_accuracy: 0.6769
Epoch 49/50
1000/1000 [==============================] - 4s 4ms/step - loss: 0.5336 - accuracy: 0.8095 - val_loss: 1.0303 - val_accuracy: 0.6904
Epoch 50/50
 989/1000 [============================>.] - ETA: 0s - loss: 0.5245 - accuracy: 0.8126
Running privacy report for epoch: 50

1000/1000 [==============================] - 2s 2ms/step
200/200 [==============================] - 0s 2ms/step
1000/1000 [==============================] - 17s 17ms/step - loss: 0.5249 - accuracy: 0.8126 - val_loss: 0.9858 - val_accuracy: 0.7030
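Before generating the plots in the next section, you can optionally print a plain-text summary of any individual attack run; for example, the most recent run collected for the 3-layer model. This relies on the summary() method of AttackResults from the same data_structures module imported above.

# Optional: inspect the most recent attack run as text.
print(callback.attack_results[-1].summary(by_slices=True))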

Epoch Plots

You can visualize how privacy risk evolves as you train models by probing the model periodically (every epochs_per_report = 2 epochs in this tutorial). This lets you pick the point in time with the best performance / privacy trade-off.

Use the TF Privacy Membership Inference Attack module to generate AttackResults. These AttackResults get combined into an AttackResultsCollection. The TF Privacy Report is designed to analyze the provided AttackResultsCollection.

results = AttackResultsCollection(all_reports)
privacy_metrics = (PrivacyMetric.AUC, PrivacyMetric.ATTACKER_ADVANTAGE)
epoch_plot = privacy_report.plot_by_epochs(
    results, privacy_metrics=privacy_metrics)

[Figure: AUC and attacker advantage plotted against training epoch for the 2-layer and 3-layer models]
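If you are running outside an interactive notebook, you can also write the figure to disk. This assumes plot_by_epochs returns a standard matplotlib figure, which the .axes usage in the next section suggests is the case.

# Optional: save the epoch plot to a file (assumes a matplotlib Figure is returned).
epoch_plot.savefig('privacy_report_by_epoch.png', bbox_inches='tight')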

Note that, as a rule, privacy vulnerability tends to increase as the number of epochs goes up. This holds across model variants as well as across attack types.

Two-layer models (with fewer convolutional layers) are generally more vulnerable than their three-layer counterparts.

Now let's see how model performance changes with respect to privacy risk.

Privacy vs Utility

privacy_metrics = (PrivacyMetric.AUC, PrivacyMetric.ATTACKER_ADVANTAGE)
utility_privacy_plot = privacy_report.plot_privacy_vs_accuracy(
    results, privacy_metrics=privacy_metrics)

for axis in utility_privacy_plot.axes:
  axis.set_xlabel('Validation accuracy')

[Figure: AUC and attacker advantage plotted against validation accuracy for the 2-layer and 3-layer models]

Three-layer models (perhaps due to too many parameters) only achieve a train accuracy of 0.85. The two-layer models achieve roughly equal performance for that level of privacy risk, but they continue on to reach better accuracy.

You can also see how the line for the two-layer models gets steeper: additional marginal gains in train accuracy come at the expense of large increases in privacy vulnerability.

This is the end of the tutorial. Feel free to analyze your own results.
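If you want to revisit these results later without retraining, one generic option (not a TF Privacy API) is to persist the AttackResultsCollection with Python's standard pickle module.

import pickle

# Persist the collected attack results for later analysis.
with open('attack_results_collection.pkl', 'wb') as f:
  pickle.dump(results, f)

# Reload them in a later session to regenerate the report plots.
with open('attack_results_collection.pkl', 'rb') as f:
  reloaded_results = pickle.load(f)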