
Azure blob storage with TensorFlow



This tutorial shows how to read and write files on Azure Blob Storage with TensorFlow, through TensorFlow IO's Azure file system integration.

An Azure storage account is needed to read and write files on Azure Blob Storage. The Azure Storage Key should be provided through an environment variable:

os.environ['TF_AZURE_STORAGE_KEY'] = '<key>'

The storage account name and container name are part of the filename URI:

az://<account-name>/<container-name>/<path>
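As a quick sketch, assuming a hypothetical account `myaccount` and container `mycontainer`, the URI for a file is assembled like this:

```python
# Hypothetical account and container names, for illustration only.
account_name = 'myaccount'
container_name = 'mycontainer'

# The az:// scheme combines account, container, and blob path into one URI.
uri = 'az://{}/{}/hello.txt'.format(account_name, container_name)
print(uri)  # az://myaccount/mycontainer/hello.txt
```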
In this tutorial, for demo purposes you can optionally set up Azurite, an Azure Storage emulator. With the Azurite emulator it is possible to read and write files through the Azure blob storage interface with TensorFlow.

Setup and usage

Install required packages, and restart runtime

try:
  %tensorflow_version 2.x
except Exception:
  pass

!pip install tensorflow-io

Install and setup Azurite (optional)

In case an Azure Storage Account is not available, the following is needed to install and set up Azurite, which emulates the Azure Storage interface:

!npm install azurite@2.7.0
npm WARN deprecated request@2.87.0: request has been deprecated, see
npm WARN saveError ENOENT: no such file or directory, open '/content/package.json'
npm notice created a lockfile as package-lock.json. You should commit this file.
npm WARN enoent ENOENT: no such file or directory, open '/content/package.json'
npm WARN content No description
npm WARN content No repository field.
npm WARN content No README data
npm WARN content No license field.

+ azurite@2.7.0
added 116 packages from 141 contributors in 6.591s

# The path for npm might not be exposed in the PATH env,
# you can find it through the 'npm bin' command
npm_bin_path = get_ipython().getoutput('npm bin')[0]
print('npm bin path: ', npm_bin_path)

# Run `azurite-blob -s` as a background process. 
# IPython doesn't recognize `&` in inline bash cells.
get_ipython().system_raw(npm_bin_path + '/' + 'azurite-blob -s &')
npm bin path:  /content/node_modules/.bin

Read and write files to Azure Storage with TensorFlow

The following is an example of reading and writing files to Azure Storage with TensorFlow's API.

It behaves the same way as other file systems (e.g., POSIX or GCS) in TensorFlow once the tensorflow-io package is imported, as tensorflow-io will automatically register the azfs scheme for use.

The Azure Storage Key should be provided through the TF_AZURE_STORAGE_KEY environment variable. Otherwise, TF_AZURE_USE_DEV_STORAGE could be set to 1 to use the Azurite emulator instead:

import os
import tensorflow as tf
import tensorflow_io as tfio

# Switch to False to use Azure Storage instead:
use_emulator = True

if use_emulator:
  os.environ['TF_AZURE_USE_DEV_STORAGE'] = '1'
  account_name = 'devstoreaccount1'
else:
  # Replace <key> with Azure Storage Key, and <account> with Azure Storage Account
  os.environ['TF_AZURE_STORAGE_KEY'] = '<key>'
  account_name = '<account>'

pathname = 'az://{}/aztest'.format(account_name)

filename = pathname + '/hello.txt'
with tf.io.gfile.GFile(filename, mode='w') as w:
  w.write("Hello, world!")

with tf.io.gfile.GFile(filename, mode='r') as r:
  print(r.read())

Hello, world!


Configurations of Azure Blob Storage in TensorFlow are always done through environment variables. Below is a complete list of available configurations:

  • TF_AZURE_USE_DEV_STORAGE: Set to 1 to use the local development storage emulator for connections like 'az://devstoreaccount1/container/file.txt'. This takes precedence over all other settings, so unset it to use any other connection
  • TF_AZURE_STORAGE_KEY: Account key for the storage account in use
  • TF_AZURE_STORAGE_USE_HTTP: Set to any value if you don't want to use HTTPS transfer; unset to use the default of HTTPS
  • TF_AZURE_STORAGE_BLOB_ENDPOINT: Set to the endpoint of blob storage - default is https://<account-name>.blob.core.windows.net
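As a sketch of how these variables combine, the following points TensorFlow at a self-hosted Azurite instance over plain HTTP. The endpoint address and key below are placeholders, and Azurite's default blob port of 10000 is an assumption about your local setup:

```python
import os

# TF_AZURE_USE_DEV_STORAGE takes precedence over everything else,
# so make sure it is unset when targeting a custom endpoint.
os.environ.pop('TF_AZURE_USE_DEV_STORAGE', None)

# Placeholder key; replace with the real account key.
os.environ['TF_AZURE_STORAGE_KEY'] = '<key>'

# Use plain HTTP instead of the default HTTPS.
os.environ['TF_AZURE_STORAGE_USE_HTTP'] = '1'

# Point at a local Azurite endpoint (default blob port is 10000).
os.environ['TF_AZURE_STORAGE_BLOB_ENDPOINT'] = ''
```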