
Load NumPy data


This tutorial provides an example of loading data from NumPy arrays into a tf.data.Dataset.

This example loads the MNIST dataset from a .npz file. However, the approach works the same regardless of where the NumPy arrays come from.

Setup

from __future__ import absolute_import, division, print_function, unicode_literals
 
import numpy as np
import tensorflow as tf

Load from .npz file

DATA_URL = 'https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz'

path = tf.keras.utils.get_file('mnist.npz', DATA_URL)
with np.load(path) as data:
  train_examples = data['x_train']
  train_labels = data['y_train']
  test_examples = data['x_test']
  test_labels = data['y_test']
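
As a quick sanity check, you can print the shapes and dtypes of the arrays you just loaded. This is a minimal sketch; the shapes in the comments assume the standard MNIST split of 60,000 training and 10,000 test images:

# Each example is a 28x28 grayscale image stored as uint8 pixel values.
print(train_examples.shape, train_examples.dtype)  # (60000, 28, 28) uint8
print(train_labels.shape, train_labels.dtype)      # (60000,) uint8
print(test_examples.shape, test_labels.shape)      # (10000, 28, 28) (10000,)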

Load NumPy arrays with tf.data.Dataset

Assuming you have an array of examples and a corresponding array of labels, pass the two arrays as a tuple into tf.data.Dataset.from_tensor_slices to create a tf.data.Dataset.

train_dataset = tf.data.Dataset.from_tensor_slices((train_examples, train_labels))
test_dataset = tf.data.Dataset.from_tensor_slices((test_examples, test_labels))
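
Each element of the resulting dataset is an (example, label) pair. As a minimal sketch, you can pull a single element to confirm this (assuming eager execution, which is the default in TensorFlow 2):

# Take one (example, label) pair and inspect it.
for example, label in train_dataset.take(1):
  print(example.shape)  # (28, 28)
  print(label.numpy())  # a single digit label, e.g. 5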

Use the datasets

Shuffle and batch the datasets

BATCH_SIZE = 64
SHUFFLE_BUFFER_SIZE = 100

train_dataset = train_dataset.shuffle(SHUFFLE_BUFFER_SIZE).batch(BATCH_SIZE)
test_dataset = test_dataset.batch(BATCH_SIZE)
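
After shuffling and batching, each dataset element is a batch of examples with a matching batch of labels. A quick check, again a sketch assuming eager execution:

# Each batch holds BATCH_SIZE examples and their labels.
for examples, labels in train_dataset.take(1):
  print(examples.shape, labels.shape)  # (64, 28, 28) (64,)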

Build and train a model

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])

model.compile(optimizer=tf.keras.optimizers.RMSprop(),
              loss=tf.keras.losses.SparseCategoricalCrossentropy(),
              metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
model.fit(train_dataset, epochs=10)
Epoch 1/10
938/938 [==============================] - 4s 4ms/step - loss: 3.5160 - sparse_categorical_accuracy: 0.8780
Epoch 2/10
938/938 [==============================] - 2s 2ms/step - loss: 0.5325 - sparse_categorical_accuracy: 0.9291
Epoch 3/10
938/938 [==============================] - 2s 2ms/step - loss: 0.3963 - sparse_categorical_accuracy: 0.9459
Epoch 4/10
938/938 [==============================] - 2s 2ms/step - loss: 0.3349 - sparse_categorical_accuracy: 0.9555
Epoch 5/10
938/938 [==============================] - 2s 2ms/step - loss: 0.2991 - sparse_categorical_accuracy: 0.9609
Epoch 6/10
938/938 [==============================] - 2s 2ms/step - loss: 0.2837 - sparse_categorical_accuracy: 0.9643
Epoch 7/10
938/938 [==============================] - 2s 2ms/step - loss: 0.2512 - sparse_categorical_accuracy: 0.9675
Epoch 8/10
938/938 [==============================] - 2s 2ms/step - loss: 0.2379 - sparse_categorical_accuracy: 0.9701
Epoch 9/10
938/938 [==============================] - 2s 2ms/step - loss: 0.2259 - sparse_categorical_accuracy: 0.9726
Epoch 10/10
938/938 [==============================] - 2s 2ms/step - loss: 0.2051 - sparse_categorical_accuracy: 0.9743

<tensorflow.python.keras.callbacks.History at 0x7f96b4ae30b8>
model.evaluate(test_dataset)
157/157 [==============================] - 0s 2ms/step - loss: 0.4985 - sparse_categorical_accuracy: 0.9615

[0.4984860098452034, 0.9615]
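
With the model trained and evaluated, you can also run inference on a batch from the test dataset. This is a minimal sketch; taking tf.argmax over the softmax outputs recovers the predicted digit for each example:

# Predict class probabilities for one test batch, then pick the most
# likely digit for each example.
for examples, labels in test_dataset.take(1):
  probabilities = model.predict(examples)
  predictions = tf.argmax(probabilities, axis=1)
  print('predicted:', predictions[:10].numpy())
  print('actual:   ', labels[:10].numpy())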