
TensorFlow Addons Callbacks: TimeStopping


Overview

This notebook demonstrates how to use the TimeStopping callback from TensorFlow Addons, which stops training once a fixed wall-clock time budget has elapsed.

Setup

pip install -q -U tensorflow-addons
import tensorflow_addons as tfa

from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten

Import and normalize the data

# the data, split between train and test sets
(x_train, y_train), (x_test, y_test) = mnist.load_data()
# normalize data
x_train, x_test = x_train / 255.0, x_test / 255.0

Build a simple MNIST model

# build the model using the Sequential API
model = Sequential()
model.add(Flatten(input_shape=(28, 28)))
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(10, activation='softmax'))

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

Simple TimeStopping usage

# initialize a TimeStopping callback that stops training after 5 seconds
time_stopping_callback = tfa.callbacks.TimeStopping(seconds=5, verbose=1)

# train the model with the TimeStopping callback;
# training stops at the first epoch boundary after
# the 5-second budget elapses, regardless of `epochs`.
model.fit(x_train, y_train,
          batch_size=64,
          epochs=100,
          callbacks=[time_stopping_callback],
          validation_data=(x_test, y_test))
Epoch 1/100
938/938 [==============================] - 2s 2ms/step - loss: 0.3393 - accuracy: 0.9019 - val_loss: 0.1657 - val_accuracy: 0.9516
Epoch 2/100
938/938 [==============================] - 2s 2ms/step - loss: 0.1658 - accuracy: 0.9514 - val_loss: 0.1171 - val_accuracy: 0.9670
Epoch 3/100
938/938 [==============================] - 2s 2ms/step - loss: 0.1220 - accuracy: 0.9636 - val_loss: 0.0951 - val_accuracy: 0.9723
Timed stopping at epoch 3 after training for 0:00:05
<tensorflow.python.keras.callbacks.History at 0x7ff8b4a0a7b8>
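As the log above shows, training stopped at an epoch boundary once the 5-second budget ran out, well before the requested 100 epochs. Conceptually, the callback records the wall-clock start time when training begins and, at the end of each epoch, checks whether the budget has been exceeded. The following is a minimal, Keras-free sketch of that idea (the `TimeBudget` class and its method names are illustrative, not the actual TimeStopping implementation):

```python
import time

class TimeBudget:
    """Illustrative sketch of a time-budget stopper: stop at the
    first epoch boundary after `seconds` of wall-clock time."""

    def __init__(self, seconds):
        self.seconds = seconds
        self.stopped_epoch = None  # set once the budget is exceeded

    def on_train_begin(self):
        # record the deadline when training starts
        self.stopping_time = time.time() + self.seconds

    def on_epoch_end(self, epoch):
        # check the budget only at epoch boundaries
        if time.time() >= self.stopping_time:
            self.stopped_epoch = epoch
            return True  # signal the training loop to stop
        return False

# simulate training epochs of ~50 ms against a 100 ms budget
budget = TimeBudget(seconds=0.1)
budget.on_train_begin()
for epoch in range(10):
    time.sleep(0.05)  # stand-in for one epoch of training work
    if budget.on_epoch_end(epoch):
        break
```

Because the check only runs at the end of an epoch, training can overshoot the budget by up to one epoch's duration, which is why the run above reports stopping "after training for 0:00:05" at epoch 3 rather than mid-epoch.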