Save and load a model using a distribution strategy


Overview

This tutorial demonstrates how to save and load models in the SavedModel format with tf.distribute.Strategy during or after training. There are two kinds of APIs for saving and loading a Keras model: high-level (tf.keras.Model.save and tf.keras.models.load_model) and low-level (tf.saved_model.save and tf.saved_model.load).
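
At a glance, the two APIs look like this (a minimal sketch; here model stands for any compiled tf.keras.Model and the paths are placeholders):

# High-level Keras API: saves and restores the whole Keras model object.
model.save('/tmp/model.keras')
restored = tf.keras.models.load_model('/tmp/model.keras')

# Low-level SavedModel API: saves and restores a generic SavedModel.
tf.saved_model.save(model, '/tmp/saved_model_dir')
loaded = tf.saved_model.load('/tmp/saved_model_dir')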

To learn about SavedModel and serialization in general, please read the SavedModel guide and the Keras model serialization guide. Let's start with a simple example.

Import dependencies:

import tensorflow_datasets as tfds

import tensorflow as tf

Load and prepare the data with TensorFlow Datasets and tf.data, and create the model using tf.distribute.MirroredStrategy:

mirrored_strategy = tf.distribute.MirroredStrategy()

def get_data():
  datasets = tfds.load(name='mnist', as_supervised=True)
  mnist_train, mnist_test = datasets['train'], datasets['test']

  BUFFER_SIZE = 10000

  BATCH_SIZE_PER_REPLICA = 64
  BATCH_SIZE = BATCH_SIZE_PER_REPLICA * mirrored_strategy.num_replicas_in_sync

  def scale(image, label):
    image = tf.cast(image, tf.float32)
    image /= 255

    return image, label

  train_dataset = mnist_train.map(scale).cache().shuffle(BUFFER_SIZE).batch(BATCH_SIZE)
  eval_dataset = mnist_test.map(scale).batch(BATCH_SIZE)

  return train_dataset, eval_dataset

def get_model():
  with mirrored_strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation='relu', input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation='relu'),
        tf.keras.layers.Dense(10)
    ])

    model.compile(loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  optimizer=tf.keras.optimizers.Adam(),
                  metrics=[tf.metrics.SparseCategoricalAccuracy()])
    return model
INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:GPU:0', '/job:localhost/replica:0/task:0/device:GPU:1', '/job:localhost/replica:0/task:0/device:GPU:2', '/job:localhost/replica:0/task:0/device:GPU:3')

Train the model with tf.keras.Model.fit:

model = get_model()
train_dataset, eval_dataset = get_data()
model.fit(train_dataset, epochs=2)
Epoch 1/2
235/235 [==============================] - 8s 8ms/step - loss: 0.3346 - sparse_categorical_accuracy: 0.9077
Epoch 2/2
235/235 [==============================] - 2s 7ms/step - loss: 0.1073 - sparse_categorical_accuracy: 0.9689
<keras.src.callbacks.History at 0x7f09080e6580>

Save and load the model

Now that you have a simple model to work with, let's explore the saving/loading APIs. There are two kinds of APIs available:

The Keras API

Here is an example of saving and loading a model with the Keras API:

keras_model_path = '/tmp/keras_save.keras'
model.save(keras_model_path)

Restore the model without tf.distribute.Strategy:

restored_keras_model = tf.keras.models.load_model(keras_model_path)
restored_keras_model.fit(train_dataset, epochs=2)
Epoch 1/2
235/235 [==============================] - 2s 4ms/step - loss: 0.0704 - sparse_categorical_accuracy: 0.9798
Epoch 2/2
235/235 [==============================] - 1s 4ms/step - loss: 0.0552 - sparse_categorical_accuracy: 0.9840
<keras.src.callbacks.History at 0x7f0a670e86a0>

After restoring the model, you can continue training on it, even without needing to call Model.compile again, since it was already compiled before saving. The model is saved in the Keras zip archive format, marked by the .keras extension. For more information, please refer to the guide on Keras saving.
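
Because the compile configuration (loss, optimizer, and metrics) is restored along with the weights, you can, for example, evaluate the restored model right away; a quick check, reusing the eval_dataset created earlier:

restored_keras_model.evaluate(eval_dataset)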

Now, restore the model and train it using a tf.distribute.Strategy:

another_strategy = tf.distribute.OneDeviceStrategy('/cpu:0')
with another_strategy.scope():
  restored_keras_model_ds = tf.keras.models.load_model(keras_model_path)
  restored_keras_model_ds.fit(train_dataset, epochs=2)
Epoch 1/2
235/235 [==============================] - 3s 12ms/step - loss: 0.0711 - sparse_categorical_accuracy: 0.9795
Epoch 2/2
235/235 [==============================] - 3s 11ms/step - loss: 0.0541 - sparse_categorical_accuracy: 0.9844

As the Model.fit output shows, loading works as expected with tf.distribute.Strategy. The strategy used here does not have to be the same strategy used before saving.
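
For example, the same saved file can be restored and trained under a fresh MirroredStrategy instance instead of the OneDeviceStrategy above; a minimal sketch reusing keras_model_path and train_dataset:

yet_another_strategy = tf.distribute.MirroredStrategy()
with yet_another_strategy.scope():
  # Loading inside the scope places the restored variables on the
  # strategy's devices, just as building a new model there would.
  restored_again = tf.keras.models.load_model(keras_model_path)
  restored_again.fit(train_dataset, epochs=1)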

The tf.saved_model API

Saving the model with the lower-level API is similar to saving it with the Keras API:

model = get_model()  # get a fresh model
saved_model_path = '/tmp/tf_save'
tf.saved_model.save(model, saved_model_path)
INFO:tensorflow:Assets written to: /tmp/tf_save/assets

Loading can be done with tf.saved_model.load. However, since it is a lower-level API (and hence has a wider range of use cases), it does not return a Keras model. Instead, it returns an object that contains functions that can be used for inference. For example:

DEFAULT_FUNCTION_KEY = 'serving_default'
loaded = tf.saved_model.load(saved_model_path)
inference_func = loaded.signatures[DEFAULT_FUNCTION_KEY]

The loaded object may contain multiple functions, each associated with a key. The "serving_default" key is the default key for the inference function of a saved Keras model. To do inference with this function:

predict_dataset = eval_dataset.map(lambda image, label: image)
for batch in predict_dataset.take(1):
  print(inference_func(batch))
{'dense_3': <tf.Tensor: shape=(256, 10), dtype=float32, numpy=
array([[-0.08887047,  0.23013261,  0.0797164 , ...,  0.08633086,
         0.20132771,  0.07340094],
       [ 0.0070231 ,  0.13520403,  0.16215095, ...,  0.18192686,
         0.09719309, -0.07021746],
       [-0.09772085,  0.15667503,  0.05740268, ...,  0.02690476,
         0.08821856, -0.065983  ],
       ...,
       [ 0.03331229,  0.21392596,  0.06844639, ...,  0.11100966,
         0.13067657,  0.06560507],
       [ 0.02533158,  0.28351632,  0.06080831, ...,  0.00390072,
         0.22675292, -0.01095095],
       [ 0.00029172,  0.09733333, -0.05236932, ...,  0.03921675,
         0.16167443, -0.03282646]], dtype=float32)>}
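
If you are not sure which signatures a SavedModel exposes, you can inspect the loaded object before picking one. A small sketch using the loaded object from above (the exact key and output names depend on the model that was saved):

print(list(loaded.signatures.keys()))     # e.g. ['serving_default']
print(inference_func.structured_outputs)  # outputs keyed by layer name, e.g. 'dense_3'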

You can also load and do inference in a distributed manner:

another_strategy = tf.distribute.MirroredStrategy()
with another_strategy.scope():
  loaded = tf.saved_model.load(saved_model_path)
  inference_func = loaded.signatures[DEFAULT_FUNCTION_KEY]

  dist_predict_dataset = another_strategy.experimental_distribute_dataset(
      predict_dataset)

  # Calling the function in a distributed manner
  for batch in dist_predict_dataset:
    result = another_strategy.run(inference_func, args=(batch,))
    print(result)
    break
INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:GPU:0', '/job:localhost/replica:0/task:0/device:GPU:1', '/job:localhost/replica:0/task:0/device:GPU:2', '/job:localhost/replica:0/task:0/device:GPU:3')
WARNING:tensorflow:Using MirroredStrategy eagerly has significant overhead currently. We will be working on improving this in the future, but for now please wrap `call_for_each_replica` or `experimental_run` or `run` inside a tf.function to get the best performance.
{'dense_3': PerReplica:{
  0: <tf.Tensor: shape=(64, 10), dtype=float32, numpy=
array([[-8.88704881e-02,  2.30132610e-01,  7.97163993e-02, ...,  8.63307863e-02,
         2.01327711e-01,  7.34009147e-02],
       [ 7.02308491e-03,  1.35204017e-01,  1.62150949e-01, ...,  1.81926847e-01,
         9.71930698e-02, -7.02174604e-02],
       ...,
       [-2.95644943e-02,  1.95396096e-01, -6.63764104e-02, ...,  5.71381152e-02,
         8.71722549e-02,  2.28793100e-02]], dtype=float32)>,
  1: <tf.Tensor: shape=(64, 10), dtype=float32, numpy=...>,
  2: <tf.Tensor: shape=(64, 10), dtype=float32, numpy=...>,
  3: <tf.Tensor: shape=(64, 10), dtype=float32, numpy=...>}}
        -6.10625967e-02,  9.29438472e-02,  1.15395218e-01,
         3.79673988e-02],
       [ 2.96583772e-02,  1.35619849e-01,  3.37549187e-02,
        -1.07223108e-01, -6.07961155e-02, -2.76672915e-02,
        -5.46414554e-02,  2.58955229e-02,  6.58891946e-02,
         2.00137477e-02],
       [ 6.49473295e-02,  8.24081227e-02,  1.09407846e-02,
        -4.45530377e-02, -4.29118834e-02, -1.18217625e-01,
        -3.59588861e-02,  4.75448817e-02,  1.43996641e-01,
        -6.47661313e-02],
       [-4.37766574e-02,  1.23372138e-01,  3.52954865e-02,
        -1.60119638e-01,  4.05133702e-02,  8.01405963e-03,
        -1.05917051e-01,  2.15808898e-02,  1.95030466e-01,
        -2.03043632e-02],
       [-2.13629156e-02,  1.80437163e-01,  8.77011865e-02,
        -5.23369387e-03, -7.08544105e-02, -1.32909566e-01,
        -1.78817958e-01,  7.01954067e-02,  1.14490062e-01,
        -5.87962940e-02],
       [-2.37873476e-02,  1.58835292e-01, -4.09351960e-02,
        -2.67066620e-02, -5.46892248e-02,  4.55831140e-02,
        -1.07746445e-01,  7.68242627e-02,  1.31297380e-01,
        -2.10610498e-02],
       [ 1.49304494e-02,  7.97875822e-02, -2.15204656e-02,
         5.17515391e-02, -3.11227590e-02, -7.93518871e-02,
        -4.56585884e-02,  4.73341085e-02,  1.13580532e-01,
        -4.56545502e-02],
       [ 6.10301457e-03,  1.49135202e-01, -1.62375160e-03,
        -1.21447153e-01, -5.97906485e-02, -4.20331135e-02,
        -9.01303440e-02,  1.80470776e-02,  7.02371150e-02,
        -1.06106466e-02],
       [ 7.17720091e-02,  1.73415780e-01,  1.61077574e-01,
         4.73868623e-02,  3.88062261e-02, -1.12214953e-01,
        -2.47786194e-01,  1.14480481e-01,  2.03277022e-01,
        -1.37052909e-02],
       [ 2.26206556e-02,  1.82275891e-01,  9.43721831e-03,
         5.75621575e-02,  2.37296335e-02, -6.37802035e-02,
        -1.06670640e-01,  1.34590194e-01,  3.55376929e-01,
        -1.73436061e-01],
       [-7.15724379e-02,  1.99982524e-01, -1.29134022e-02,
        -1.66657001e-01, -1.01888716e-01, -8.33422691e-02,
        -1.04193240e-01,  5.61448634e-02, -4.19500247e-02,
        -2.07699835e-04],
       [-1.99338309e-02,  1.79406077e-01, -9.77844559e-03,
        -7.63786510e-02, -9.17157605e-02, -6.58254176e-02,
        -1.06738336e-01,  7.06022084e-02,  1.94230378e-01,
         1.33451857e-02],
       [-1.66671816e-02, -1.04731619e-02,  2.50794590e-02,
         1.26084294e-02, -6.77415133e-02, -1.05476260e-01,
        -1.26577273e-01,  1.21548533e-01,  2.53283262e-01,
        -9.98395383e-02],
       [ 1.12005100e-01,  6.68410510e-02,  3.49344090e-02,
        -3.85726318e-02, -1.00318134e-01, -7.25245103e-02,
        -1.61754340e-01,  1.24220312e-01,  2.19939888e-01,
        -1.20532416e-01],
       [-7.79906511e-02,  3.28637898e-01,  2.01819614e-02,
         7.67050460e-02,  1.26973748e-01, -8.29249099e-02,
        -1.19551025e-01,  3.96031104e-02,  2.34470412e-01,
        -4.55087125e-02],
       [-7.10161775e-03,  6.71651363e-02,  1.23315439e-01,
         6.03847206e-02, -4.37426344e-02, -7.74070695e-02,
        -1.17905945e-01,  2.06869707e-01,  2.45417476e-01,
        -1.13937207e-01],
       [-2.72866059e-02,  1.19134001e-01, -5.84903657e-02,
        -4.21878509e-02, -1.85396802e-02,  3.39659341e-02,
        -1.05989635e-01,  2.05431189e-02,  1.13681965e-01,
        -2.45663468e-02],
       [ 9.59841087e-02,  2.03809738e-01,  8.21996629e-02,
        -4.61001694e-03, -2.41664518e-02,  1.62050612e-02,
        -1.68202400e-01,  7.98271894e-02,  3.21698397e-01,
        -2.35490613e-02],
       [-6.05810434e-04,  1.33921996e-01,  3.05980239e-02,
        -7.15356618e-02,  4.84495163e-02,  3.33615132e-02,
        -1.01266295e-01,  4.20148782e-02,  1.49674490e-01,
        -6.27710149e-02],
       [-3.37799080e-02,  1.53237894e-01,  2.29513273e-02,
        -3.71082425e-02, -6.43590242e-02, -8.73020589e-02,
        -3.38607356e-02,  5.36565408e-02,  9.51215476e-02,
        -3.32198739e-02],
       [ 5.20607978e-02,  6.31053597e-02, -2.14972980e-02,
        -8.00713673e-02, -8.75142068e-02, -9.10174847e-02,
        -8.12271237e-02,  8.62855017e-02,  6.78770840e-02,
        -2.02045459e-02],
       [-4.88964580e-02,  1.54221833e-01, -7.81740323e-02,
         1.16735101e-02,  5.48713431e-02,  4.39455472e-02,
        -7.86736757e-02,  9.06214491e-02,  1.79706812e-01,
        -5.27878329e-02],
       [-8.97183083e-04,  2.20044464e-01,  2.75967531e-02,
        -7.91532993e-02, -2.54691057e-02, -2.56281905e-02,
        -1.40711069e-01, -8.02882314e-02,  1.69499636e-01,
        -4.30632383e-02],
       [ 3.37363780e-02,  6.17061965e-02,  1.25457510e-01,
        -2.71759257e-02, -7.46657699e-02, -1.27983809e-01,
        -1.33206740e-01,  9.27011743e-02,  1.70535952e-01,
        -4.40608673e-02],
       [-2.23535784e-02,  2.15669826e-01,  7.96924308e-02,
         1.32489707e-02,  2.90064383e-02,  1.90435499e-02,
        -2.53891677e-01,  5.45614399e-03,  2.59311318e-01,
        -4.82472703e-02],
       [ 1.82506610e-02,  4.59257066e-02, -3.42016667e-02,
        -4.10056375e-02, -2.24843830e-01, -8.49857777e-02,
        -5.51653691e-02,  1.11465648e-01,  1.43880963e-01,
         8.07211176e-03],
       [ 3.33122909e-02,  2.13925928e-01,  6.84463903e-02,
        -9.78758410e-02, -8.65304992e-02,  3.38434987e-03,
        -1.81885287e-01,  1.11009680e-01,  1.30676568e-01,
         6.56051114e-02],
       [ 2.53316164e-02,  2.83516347e-01,  6.08082861e-02,
         7.44206458e-02,  1.02712810e-01, -4.44115885e-03,
        -2.29838431e-01,  3.90066579e-03,  2.26752907e-01,
        -1.09509565e-02],
       [ 2.91731209e-04,  9.73333344e-02, -5.23693264e-02,
        -9.83056426e-02, -1.43108189e-01, -3.65451537e-02,
        -1.48899078e-01,  3.92167568e-02,  1.61674380e-01,
        -3.28264721e-02]], dtype=float32)>
} }
2023-12-07 02:55:59.111538: W tensorflow/core/kernels/data/cache_dataset_ops.cc:858] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset  will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.

Calling the restored function is just a forward pass on the saved model, equivalent to tf.keras.Model.predict. What if you want to continue training the loaded function, or embed it into a bigger model? A common practice is to wrap the loaded object in a Keras layer. Conveniently, TF Hub provides hub.KerasLayer for exactly this purpose, shown here:

import tensorflow_hub as hub

def build_model(loaded):
  x = tf.keras.layers.Input(shape=(28, 28, 1), name='input_x')
  # Wrap what's loaded to a KerasLayer
  keras_layer = hub.KerasLayer(loaded, trainable=True)(x)
  model = tf.keras.Model(x, keras_layer)
  return model

another_strategy = tf.distribute.MirroredStrategy()
with another_strategy.scope():
  loaded = tf.saved_model.load(saved_model_path)
  model = build_model(loaded)

  model.compile(loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                optimizer=tf.keras.optimizers.Adam(),
                metrics=[tf.metrics.SparseCategoricalAccuracy()])
  model.fit(train_dataset, epochs=2)
INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:GPU:0', '/job:localhost/replica:0/task:0/device:GPU:1', '/job:localhost/replica:0/task:0/device:GPU:2', '/job:localhost/replica:0/task:0/device:GPU:3')
INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:GPU:0', '/job:localhost/replica:0/task:0/device:GPU:1', '/job:localhost/replica:0/task:0/device:GPU:2', '/job:localhost/replica:0/task:0/device:GPU:3')
2023-12-07 02:55:59.963736: W tensorflow/core/grappler/optimizers/data/auto_shard.cc:553] The `assert_cardinality` transformation is currently not handled by the auto-shard rewrite and will be removed.
Epoch 1/2
INFO:tensorflow:Collective all_reduce tensors: 6 all_reduces, num_devices = 4, group_size = 4, implementation = CommunicationImplementation.NCCL, num_packs = 1
INFO:tensorflow:Collective all_reduce tensors: 6 all_reduces, num_devices = 4, group_size = 4, implementation = CommunicationImplementation.NCCL, num_packs = 1
INFO:tensorflow:Collective all_reduce tensors: 6 all_reduces, num_devices = 4, group_size = 4, implementation = CommunicationImplementation.NCCL, num_packs = 1
INFO:tensorflow:Collective all_reduce tensors: 6 all_reduces, num_devices = 4, group_size = 4, implementation = CommunicationImplementation.NCCL, num_packs = 1
235/235 [==============================] - 5s 7ms/step - loss: 0.3494 - sparse_categorical_accuracy: 0.9011
Epoch 2/2
235/235 [==============================] - 2s 7ms/step - loss: 0.1117 - sparse_categorical_accuracy: 0.9681

In the above example, TensorFlow Hub's hub.KerasLayer wraps the result loaded back from tf.saved_model.load into a Keras layer that is then used to build another model. This is very useful for transfer learning.
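
For transfer learning specifically, you would typically freeze the wrapped SavedModel and train only new layers added on top. Here is a minimal sketch under that assumption; the extra Dense head and the variable names are illustrative, not part of the tutorial:

def build_transfer_model(loaded):
  x = tf.keras.layers.Input(shape=(28, 28, 1), name='input_x')
  # trainable=False freezes the weights of the wrapped SavedModel.
  frozen_features = hub.KerasLayer(loaded, trainable=False)(x)
  # Hypothetical new head trained on top of the frozen outputs.
  outputs = tf.keras.layers.Dense(10)(frozen_features)
  return tf.keras.Model(x, outputs)

with another_strategy.scope():
  transfer_model = build_transfer_model(loaded)
  transfer_model.compile(
      loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
      optimizer=tf.keras.optimizers.Adam(),
      metrics=[tf.metrics.SparseCategoricalAccuracy()])

Only the new Dense head's variables are updated during training; the wrapped SavedModel acts as a fixed feature extractor.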

Which API should I use?

For saving, if you are working with a Keras model, use the Keras Model.save API unless you need the additional control allowed by the low-level API. If what you are saving is not a Keras model, then the lower-level API, tf.saved_model.save, is your only choice.

For loading, your API choice depends on what you want to get back from loading. If you cannot (or do not want to) get a Keras model, then use tf.saved_model.load. Otherwise, use tf.keras.models.load_model. Note that you can get a Keras model back only if you saved a Keras model.

It is possible to mix and match the APIs. You can save a Keras model with Model.save and then load it back as a non-Keras object with the low-level API, tf.saved_model.load.

model = get_model()

# Saving the model using Keras `Model.save`
model.save(saved_model_path)

another_strategy = tf.distribute.MirroredStrategy()
# Loading the model using the lower-level API
with another_strategy.scope():
  loaded = tf.saved_model.load(saved_model_path)
INFO:tensorflow:Assets written to: /tmp/tf_save/assets
INFO:tensorflow:Assets written to: /tmp/tf_save/assets
INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:GPU:0', '/job:localhost/replica:0/task:0/device:GPU:1', '/job:localhost/replica:0/task:0/device:GPU:2', '/job:localhost/replica:0/task:0/device:GPU:3')
INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:GPU:0', '/job:localhost/replica:0/task:0/device:GPU:1', '/job:localhost/replica:0/task:0/device:GPU:2', '/job:localhost/replica:0/task:0/device:GPU:3')

Saving/Loading from a local device

When saving and loading from a local I/O device while training on remote devices—for example, when using a Cloud TPU—you must use the option experimental_io_device in tf.saved_model.SaveOptions and tf.saved_model.LoadOptions to set the I/O device to localhost. For example:

model = get_model()

# Saving the model to a path on localhost.
saved_model_path = '/tmp/tf_save'
save_options = tf.saved_model.SaveOptions(experimental_io_device='/job:localhost')
model.save(saved_model_path, options=save_options)

# Loading the model from a path on localhost.
another_strategy = tf.distribute.MirroredStrategy()
with another_strategy.scope():
  load_options = tf.saved_model.LoadOptions(experimental_io_device='/job:localhost')
  loaded = tf.keras.models.load_model(saved_model_path, options=load_options)
INFO:tensorflow:Assets written to: /tmp/tf_save/assets
INFO:tensorflow:Assets written to: /tmp/tf_save/assets
INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:GPU:0', '/job:localhost/replica:0/task:0/device:GPU:1', '/job:localhost/replica:0/task:0/device:GPU:2', '/job:localhost/replica:0/task:0/device:GPU:3')
INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:GPU:0', '/job:localhost/replica:0/task:0/device:GPU:1', '/job:localhost/replica:0/task:0/device:GPU:2', '/job:localhost/replica:0/task:0/device:GPU:3')

Caveats

One special case arises when you create a Keras model in a way that leaves its input shape undefined, such as by subclassing tf.keras.Model, and then try to save it before training. For example:

class SubclassedModel(tf.keras.Model):
  """Example model defined by subclassing `tf.keras.Model`."""

  output_name = 'output_layer'

  def __init__(self):
    super(SubclassedModel, self).__init__()
    self._dense_layer = tf.keras.layers.Dense(
        5, dtype=tf.dtypes.float32, name=self.output_name)

  def call(self, inputs):
    return self._dense_layer(inputs)

my_model = SubclassedModel()
try:
  my_model.save(saved_model_path)
except ValueError as e:
  print(f'{type(e).__name__}: ', *e.args)
WARNING:tensorflow:Skipping full serialization of Keras layer <__main__.SubclassedModel object at 0x7f0a3b891370>, because it is not built.
WARNING:tensorflow:Skipping full serialization of Keras layer <__main__.SubclassedModel object at 0x7f0a3b891370>, because it is not built.
ValueError:  Model <__main__.SubclassedModel object at 0x7f0a3b891370> cannot be saved either because the input shape is not available or because the forward pass of the model is not defined.To define a forward pass, please override `Model.call()`. To specify an input shape, either call `build(input_shape)` directly, or call the model on actual data using `Model()`, `Model.fit()`, or `Model.predict()`. If you have a custom training step, please make sure to invoke the forward pass in train step through `Model.__call__`, i.e. `model(inputs)`, as opposed to `model.call()`.

A SavedModel saves the tf.types.experimental.ConcreteFunction objects generated when you trace a tf.function (check When is a Function tracing? in the Introduction to graphs and tf.function guide to learn more). If you get a ValueError like this one, it's because Model.save was not able to find or create a traced ConcreteFunction.

tf.saved_model.save(my_model, saved_model_path)
x = tf.saved_model.load(saved_model_path)
x.signatures
WARNING:tensorflow:Skipping full serialization of Keras layer <__main__.SubclassedModel object at 0x7f0a3b891370>, because it is not built.
WARNING:tensorflow:Skipping full serialization of Keras layer <__main__.SubclassedModel object at 0x7f0a3b891370>, because it is not built.
WARNING:tensorflow:Skipping full serialization of Keras layer <keras.src.layers.core.dense.Dense object at 0x7f0a56aa35e0>, because it is not built.
WARNING:tensorflow:Skipping full serialization of Keras layer <keras.src.layers.core.dense.Dense object at 0x7f0a56aa35e0>, because it is not built.
INFO:tensorflow:Assets written to: /tmp/tf_save/assets
INFO:tensorflow:Assets written to: /tmp/tf_save/assets
_SignatureMap({})
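
The SavedModel above has no usable signatures. If you need a serving signature without going through Keras training, one option (a sketch, not part of the original tutorial) is to trace the forward pass yourself with tf.function and pass the resulting ConcreteFunction to tf.saved_model.save via its signatures argument; the input shape below is just an assumption:

# Trace `call` explicitly with a fixed input signature, then export that
# ConcreteFunction as the serving signature.
traced_call = tf.function(
    my_model.call,
    input_signature=[tf.TensorSpec(shape=[None, 5], dtype=tf.float32)])

tf.saved_model.save(
    my_model, saved_model_path,
    signatures={'serving_default': traced_call.get_concrete_function()})

restored = tf.saved_model.load(saved_model_path)
print(restored.signatures)  # should no longer be an empty _SignatureMap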

Usually the model's forward pass—the call method—will be traced automatically when the model is called for the first time, often via the Keras Model.fit method. A ConcreteFunction can also be generated by the Keras Sequential and Functional APIs if you set the input shape, for example by making the first layer a tf.keras.layers.InputLayer (or another layer type) and passing it the input_shape keyword argument, as in the sketch below.
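
For instance, here is a minimal sketch (the model and save path are illustrative, not from the tutorial) of a Sequential model whose input shape is declared up front, so a ConcreteFunction can be traced and the model can be saved before any training:

# Declaring the input shape on the first layer builds the model immediately,
# so Keras can trace the forward pass and `save` works before training.
untrained_model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(5,)),
    tf.keras.layers.Dense(5, name='output_layer'),
])

untrained_model.save('/tmp/tf_save_untrained')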

To check whether your model has any traced ConcreteFunctions, test whether Model.save_spec is None:

print(my_model.save_spec() is None)
True

Let's use tf.keras.Model.fit to train the model, and notice that the save_spec gets defined and saving now works:

BATCH_SIZE_PER_REPLICA = 4
BATCH_SIZE = BATCH_SIZE_PER_REPLICA * mirrored_strategy.num_replicas_in_sync

dataset_size = 100
dataset = tf.data.Dataset.from_tensors(
    (tf.range(5, dtype=tf.float32), tf.range(5, dtype=tf.float32))
    ).repeat(dataset_size).batch(BATCH_SIZE)

my_model.compile(optimizer='adam', loss='mean_squared_error')
my_model.fit(dataset, epochs=2)

print(my_model.save_spec() is None)
my_model.save(saved_model_path)
Epoch 1/2
7/7 [==============================] - 1s 3ms/step - loss: 14.9597
Epoch 2/2
7/7 [==============================] - 0s 2ms/step - loss: 14.5677
False
INFO:tensorflow:Unsupported signature for serialization: ((TensorSpec(shape=(5, 5), dtype=tf.float32, name='gradient'), <tensorflow.python.framework.func_graph.UnknownArgument object at 0x7f0a3b832ca0>, 139681799552240), {}).
INFO:tensorflow:Unsupported signature for serialization: ((TensorSpec(shape=(5, 5), dtype=tf.float32, name='gradient'), <tensorflow.python.framework.func_graph.UnknownArgument object at 0x7f0a3b832ca0>, 139681799552240), {}).
INFO:tensorflow:Unsupported signature for serialization: ((TensorSpec(shape=(5,), dtype=tf.float32, name='gradient'), <tensorflow.python.framework.func_graph.UnknownArgument object at 0x7f0a3b80d850>, 139681924288656), {}).
INFO:tensorflow:Unsupported signature for serialization: ((TensorSpec(shape=(5,), dtype=tf.float32, name='gradient'), <tensorflow.python.framework.func_graph.UnknownArgument object at 0x7f0a3b80d850>, 139681924288656), {}).
INFO:tensorflow:Unsupported signature for serialization: ((TensorSpec(shape=(5, 5), dtype=tf.float32, name='gradient'), <tensorflow.python.framework.func_graph.UnknownArgument object at 0x7f0a3b832ca0>, 139681799552240), {}).
INFO:tensorflow:Unsupported signature for serialization: ((TensorSpec(shape=(5, 5), dtype=tf.float32, name='gradient'), <tensorflow.python.framework.func_graph.UnknownArgument object at 0x7f0a3b832ca0>, 139681799552240), {}).
INFO:tensorflow:Unsupported signature for serialization: ((TensorSpec(shape=(5,), dtype=tf.float32, name='gradient'), <tensorflow.python.framework.func_graph.UnknownArgument object at 0x7f0a3b80d850>, 139681924288656), {}).
INFO:tensorflow:Unsupported signature for serialization: ((TensorSpec(shape=(5,), dtype=tf.float32, name='gradient'), <tensorflow.python.framework.func_graph.UnknownArgument object at 0x7f0a3b80d850>, 139681924288656), {}).
INFO:tensorflow:Assets written to: /tmp/tf_save/assets
INFO:tensorflow:Assets written to: /tmp/tf_save/assets