
Using TensorBoard in notebooks


TensorBoard can be used directly within notebook experiences such as Colab and Jupyter. This can be helpful for sharing results, for integrating TensorBoard into existing workflows, and for using TensorBoard without installing anything locally.

Setup

Start by installing TF 2.0 and loading the TensorBoard notebook extension:

For Jupyter users: if you have installed Jupyter and TensorBoard into the same virtualenv, then you should be good to go. If you are using a more complicated setup, such as a global Jupyter installation with kernels for different Conda/virtualenv environments, then you must ensure that the tensorboard binary is on your PATH inside the Jupyter notebook context. One way to do this is to modify the kernel_spec to prepend the environment's bin directory to PATH, as described here.
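As a sketch of that kernel_spec change (all paths and the display name below are placeholders for your own environment), the kernel.json could prepend the environment's bin directory to PATH via its env field:

```json
{
  "argv": ["/path/to/env/bin/python", "-m", "ipykernel_launcher",
           "-f", "{connection_file}"],
  "display_name": "Python (my-env)",
  "language": "python",
  "env": {"PATH": "/path/to/env/bin:${PATH}"}
}
```

With this in place, kernels launched from that spec can find the tensorboard binary that lives alongside the environment's Python.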

If you are running a Jupyter notebook server using TensorFlow's nightly Docker image, you need to expose not only the notebook's port, but TensorBoard's port as well.

Thus, run the container with the following command:

docker run -it -p 8888:8888 -p 6006:6006 \
  tensorflow/tensorflow:nightly-py3-jupyter

where -p 6006 is the default port for TensorBoard. This allocates a port for you to run one TensorBoard instance. To have concurrent instances, you must allocate more ports.
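For example, to leave room for a second concurrent TensorBoard instance, you could publish an additional host port when starting the container (the extra 6007 mapping here is illustrative, not required by the image):

```
docker run -it -p 8888:8888 -p 6006:6006 -p 6007:6007 \
  tensorflow/tensorflow:nightly-py3-jupyter
```

A second instance inside the container could then be pointed at that port, e.g. with TensorBoard's --port flag.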

 # Load the TensorBoard notebook extension
%load_ext tensorboard
 

Import TensorFlow, datetime, and os:

 import tensorflow as tf
import datetime, os
 

TensorBoard in notebooks

Download the FashionMNIST dataset and scale it:

 fashion_mnist = tf.keras.datasets.fashion_mnist

(x_train, y_train),(x_test, y_test) = fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
 
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz
32768/29515 [=================================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz
26427392/26421880 [==============================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz
8192/5148 [===============================================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz
4423680/4422102 [==============================] - 0s 0us/step
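The division by 255.0 above rescales the uint8 pixel values from [0, 255] into floats in [0.0, 1.0], a common normalization for image inputs. A minimal sketch of the effect, independent of the dataset:

```python
import numpy as np

# Fashion-MNIST images are uint8 arrays with values in [0, 255].
pixels = np.array([0, 128, 255], dtype=np.uint8)

# Dividing by 255.0 promotes the array to float and maps it into [0.0, 1.0].
scaled = pixels / 255.0
print(scaled.min(), scaled.max())  # 0.0 1.0
```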

Create a very simple model:

 def create_model():
  return tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(512, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation='softmax')
  ])
 
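As a quick sanity check on the model above: Flatten turns each 28×28 image into a 784-element vector, so the two Dense layers account for all of the trainable parameters. The arithmetic (plain Python, no TensorFlow needed) works out as:

```python
# Dense layer parameters = inputs * units + units (weights plus biases).
flatten_outputs = 28 * 28                      # 784 values per image
dense1_params = flatten_outputs * 512 + 512    # 401,920
dense2_params = 512 * 10 + 10                  # 5,130
total_params = dense1_params + dense2_params
print(total_params)  # 407050
```

This matches what model.summary() would report as the total trainable parameter count (Dropout adds no parameters).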

Train the model using Keras and the TensorBoard callback:

 def train_model():
  
  model = create_model()
  model.compile(optimizer='adam',
                loss='sparse_categorical_crossentropy',
                metrics=['accuracy'])

  logdir = os.path.join("logs", datetime.datetime.now().strftime("%Y%m%d-%H%M%S"))
  tensorboard_callback = tf.keras.callbacks.TensorBoard(logdir, histogram_freq=1)

  model.fit(x=x_train, 
            y=y_train, 
            epochs=5, 
            validation_data=(x_test, y_test), 
            callbacks=[tensorboard_callback])

train_model()
 
Train on 60000 samples, validate on 10000 samples
Epoch 1/5
60000/60000 [==============================] - 11s 182us/sample - loss: 0.4976 - accuracy: 0.8204 - val_loss: 0.4143 - val_accuracy: 0.8538
Epoch 2/5
60000/60000 [==============================] - 10s 174us/sample - loss: 0.3845 - accuracy: 0.8588 - val_loss: 0.3855 - val_accuracy: 0.8626
Epoch 3/5
60000/60000 [==============================] - 10s 175us/sample - loss: 0.3513 - accuracy: 0.8705 - val_loss: 0.3740 - val_accuracy: 0.8607
Epoch 4/5
60000/60000 [==============================] - 11s 177us/sample - loss: 0.3287 - accuracy: 0.8793 - val_loss: 0.3596 - val_accuracy: 0.8719
Epoch 5/5
60000/60000 [==============================] - 11s 178us/sample - loss: 0.3153 - accuracy: 0.8825 - val_loss: 0.3360 - val_accuracy: 0.8782

Start TensorBoard within the notebook using magics:

 %tensorboard --logdir logs
 

You can now view dashboards such as scalars, graphs, histograms, and others. Some dashboards are not available in Colab yet (such as the profile plugin).

The %tensorboard magic has exactly the same format as the TensorBoard command line invocation, but with a %-sign in front of it.

You can also start TensorBoard before training to monitor it in progress:

 %tensorboard --logdir logs
 

The same TensorBoard backend is reused by issuing the same command. If a different logs directory were chosen, a new instance of TensorBoard would be opened. Ports are managed automatically.

Start training a new model and watch TensorBoard update automatically every 30 seconds, or refresh it with the button on the top right:

 train_model()
 
Train on 60000 samples, validate on 10000 samples
Epoch 1/5
60000/60000 [==============================] - 11s 184us/sample - loss: 0.4968 - accuracy: 0.8223 - val_loss: 0.4216 - val_accuracy: 0.8481
Epoch 2/5
60000/60000 [==============================] - 11s 176us/sample - loss: 0.3847 - accuracy: 0.8587 - val_loss: 0.4056 - val_accuracy: 0.8545
Epoch 3/5
60000/60000 [==============================] - 11s 176us/sample - loss: 0.3495 - accuracy: 0.8727 - val_loss: 0.3600 - val_accuracy: 0.8700
Epoch 4/5
60000/60000 [==============================] - 11s 179us/sample - loss: 0.3282 - accuracy: 0.8795 - val_loss: 0.3636 - val_accuracy: 0.8694
Epoch 5/5
60000/60000 [==============================] - 11s 176us/sample - loss: 0.3115 - accuracy: 0.8839 - val_loss: 0.3438 - val_accuracy: 0.8764

You can use the tensorboard.notebook APIs for a bit more control:

 from tensorboard import notebook
notebook.list() # View open TensorBoard instances
 
Known TensorBoard instances:

  - port 6006: logdir logs (started 0:00:54 ago; pid 265)

 # Control TensorBoard display. If no port is provided, 
# the most recently launched TensorBoard is used
notebook.display(port=6006, height=1000)