
Classification on imbalanced data


This tutorial demonstrates how to classify a highly imbalanced dataset in which the number of examples in one class vastly outnumbers the examples in another. You will work with the Credit Card Fraud Detection dataset hosted on Kaggle. The aim is to detect a mere 492 fraudulent transactions from 284,807 transactions in total. You will use Keras to define the model, and class weights to help the model learn from the imbalanced data.

This tutorial contains complete code to:

  • Load a CSV file using Pandas.
  • Create train, validation, and test sets.
  • Define and train a model using Keras (including setting class weights).
  • Evaluate the model using various metrics (including precision and recall).
  • Try common techniques for dealing with imbalanced data, such as:
    • Class weighting
    • Oversampling

Setup

import tensorflow as tf
from tensorflow import keras

import os
import tempfile

import matplotlib as mpl
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import seaborn as sns

import sklearn
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
mpl.rcParams['figure.figsize'] = (12, 10)
colors = plt.rcParams['axes.prop_cycle'].by_key()['color']

Data processing and exploration

Download the Kaggle Credit Card Fraud dataset

Pandas is a Python library with many helpful utilities for loading and working with structured data. You can use it to download the CSV into a Pandas DataFrame:

raw_df = pd.read_csv('https://storage.googleapis.com/download.tensorflow.org/data/creditcard.csv')
raw_df.head()
raw_df[['Time', 'V1', 'V2', 'V3', 'V4', 'V5', 'V26', 'V27', 'V28', 'Amount', 'Class']].describe()

Examine the class label imbalance

Let's look at the dataset imbalance:

neg, pos = np.bincount(raw_df['Class'])
total = neg + pos
print('Examples:\n    Total: {}\n    Positive: {} ({:.2f}% of total)\n'.format(
    total, pos, 100 * pos / total))
Examples:
    Total: 284807
    Positive: 492 (0.17% of total)

This shows the small fraction of positive samples.

Clean, split, and normalize the data

The raw data has a few issues. First, the Time and Amount columns are too variable to use directly. Drop the Time column (since it's not clear what it means) and take the log of the Amount column to reduce its range.

cleaned_df = raw_df.copy()

# You don't want the `Time` column.
cleaned_df.pop('Time')

# The `Amount` column covers a huge range. Convert to log-space.
eps = 0.001 # 0 => 0.1¢
cleaned_df['Log Amount'] = np.log(cleaned_df.pop('Amount')+eps)

Split the dataset into train, validation, and test sets. The validation set is used during model fitting to evaluate the loss and any metrics, but the model is not fit with this data. The test set is completely unused during the training phase and is only used at the end to evaluate how well the model generalizes to new data. This is especially important with imbalanced datasets, where overfitting is a significant concern due to the lack of training data.

# Use a utility from sklearn to split and shuffle your dataset.
train_df, test_df = train_test_split(cleaned_df, test_size=0.2)
train_df, val_df = train_test_split(train_df, test_size=0.2)

# Form np arrays of labels and features.
train_labels = np.array(train_df.pop('Class'))
bool_train_labels = train_labels != 0
val_labels = np.array(val_df.pop('Class'))
test_labels = np.array(test_df.pop('Class'))

train_features = np.array(train_df)
val_features = np.array(val_df)
test_features = np.array(test_df)

Normalize the input features using the sklearn StandardScaler. This will set the mean to 0 and the standard deviation to 1. Note that the scaler is fit only on the training features below, so the model gets no information from the validation or test sets.

scaler = StandardScaler()
train_features = scaler.fit_transform(train_features)

val_features = scaler.transform(val_features)
test_features = scaler.transform(test_features)

# Clip the scaled features to [-5, 5] to limit the influence of extreme outliers.
train_features = np.clip(train_features, -5, 5)
val_features = np.clip(val_features, -5, 5)
test_features = np.clip(test_features, -5, 5)


print('Training labels shape:', train_labels.shape)
print('Validation labels shape:', val_labels.shape)
print('Test labels shape:', test_labels.shape)

print('Training features shape:', train_features.shape)
print('Validation features shape:', val_features.shape)
print('Test features shape:', test_features.shape)
Training labels shape: (182276,)
Validation labels shape: (45569,)
Test labels shape: (56962,)
Training features shape: (182276, 29)
Validation features shape: (45569, 29)
Test features shape: (56962, 29)

Look at the data distribution

Next, compare the distributions of the positive and negative examples over a few features. Good questions to ask yourself at this point are:

  • Do these distributions make sense?
    • Yes. You've normalized the input, and these are mostly concentrated in the +/- 2 range.
  • Can you see the difference between the distributions?
    • Yes, the positive examples contain a much higher rate of extreme values.
pos_df = pd.DataFrame(train_features[ bool_train_labels], columns=train_df.columns)
neg_df = pd.DataFrame(train_features[~bool_train_labels], columns=train_df.columns)

sns.jointplot(x=pos_df['V5'], y=pos_df['V6'],
              kind='hex', xlim=(-5,5), ylim=(-5,5))
plt.suptitle("Positive distribution")

sns.jointplot(x=neg_df['V5'], y=neg_df['V6'],
              kind='hex', xlim=(-5,5), ylim=(-5,5))
_ = plt.suptitle("Negative distribution")

[Figure: hexbin joint plot of V5 vs. V6 for the positive examples ("Positive distribution")]

[Figure: hexbin joint plot of V5 vs. V6 for the negative examples ("Negative distribution")]

Define the model and metrics

Define a function that creates a simple neural network with a densely connected hidden layer, a dropout layer to reduce overfitting, and an output sigmoid layer that returns the probability of a transaction being fraudulent:

METRICS = [
      keras.metrics.TruePositives(name='tp'),
      keras.metrics.FalsePositives(name='fp'),
      keras.metrics.TrueNegatives(name='tn'),
      keras.metrics.FalseNegatives(name='fn'), 
      keras.metrics.BinaryAccuracy(name='accuracy'),
      keras.metrics.Precision(name='precision'),
      keras.metrics.Recall(name='recall'),
      keras.metrics.AUC(name='auc'),
      keras.metrics.AUC(name='prc', curve='PR'), # precision-recall curve
]

def make_model(metrics=METRICS, output_bias=None):
  if output_bias is not None:
    output_bias = tf.keras.initializers.Constant(output_bias)
  model = keras.Sequential([
      keras.layers.Dense(
          16, activation='relu',
          input_shape=(train_features.shape[-1],)),
      keras.layers.Dropout(0.5),
      keras.layers.Dense(1, activation='sigmoid',
                         bias_initializer=output_bias),
  ])

  model.compile(
      optimizer=keras.optimizers.Adam(learning_rate=1e-3),
      loss=keras.losses.BinaryCrossentropy(),
      metrics=metrics)

  return model

Understanding useful metrics

Notice that there are a few metrics defined above that can be computed by the model, which will be helpful when evaluating performance. A short sketch after the following list shows how the count-based ones are calculated.

  • False negatives and false positives are samples that were incorrectly classified
  • True negatives and true positives are samples that were correctly classified
  • Accuracy is the percentage of examples correctly classified: $\frac{\text{true samples}}{\text{total samples}}$
  • Precision is the percentage of predicted positives that were correctly classified: $\frac{\text{true positives}}{\text{true positives} + \text{false positives}}$
  • Recall is the percentage of actual positives that were correctly classified: $\frac{\text{true positives}}{\text{true positives} + \text{false negatives}}$
  • AUC refers to the Area Under the Curve of a Receiver Operating Characteristic curve (ROC-AUC). This metric is equal to the probability that a classifier will rank a random positive sample higher than a random negative sample.
  • AUPRC refers to the Area Under the Curve of the Precision-Recall Curve. This metric computes precision-recall pairs for different probability thresholds.
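
As a quick illustration (a snippet added alongside this list, not part of the original tutorial's code), the count-based metrics above follow directly from the four confusion-matrix totals. The counts here are example values, not model output:

# Illustrative only: compute the count-based metrics from example
# confusion-matrix totals (hypothetical values, not model output).
tp, fp, tn, fn = 75, 9, 45467, 18

accuracy = (tp + tn) / (tp + fp + tn + fn)  # fraction of all samples classified correctly
precision = tp / (tp + fp)                  # fraction of predicted positives that are real
recall = tp / (tp + fn)                     # fraction of actual positives that are found

print('Accuracy: {:.4f}  Precision: {:.4f}  Recall: {:.4f}'.format(
    accuracy, precision, recall))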


Baseline model

Build the model

Now create and train your model using the function that was defined earlier. Notice that the model is fit using a larger-than-default batch size of 2048. This is important to ensure that each batch has a decent chance of containing a few positive samples; if the batch size were too small, the model would likely have no fraudulent transactions to learn from. The quick estimate below illustrates the arithmetic.
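
As a rough sanity check (an illustrative snippet, not part of the original tutorial), multiply the positive rate computed earlier by a few candidate batch sizes to estimate how many fraudulent examples an average batch contains:

# Expected number of positive samples per batch at the dataset's ~0.17% positive rate.
pos_rate = pos / total
for candidate_batch_size in [32, 256, 2048]:
  print('batch_size={:5d} -> ~{:.2f} positives per batch'.format(
      candidate_batch_size, candidate_batch_size * pos_rate))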

EPOCHS = 100
BATCH_SIZE = 2048

# Stop training when the validation PR-AUC stops improving,
# and restore the weights from the best epoch.
early_stopping = tf.keras.callbacks.EarlyStopping(
    monitor='val_prc',
    verbose=1,
    patience=10,
    mode='max',
    restore_best_weights=True)
model = make_model()
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 16)                480       
_________________________________________________________________
dropout (Dropout)            (None, 16)                0         
_________________________________________________________________
dense_1 (Dense)              (None, 1)                 17        
=================================================================
Total params: 497
Trainable params: 497
Non-trainable params: 0
_________________________________________________________________

Test run the model:

model.predict(train_features[:10])
array([[0.6450613 ],
       [0.8092749 ],
       [0.7684952 ],
       [0.63029546],
       [0.6491195 ],
       [0.4928469 ],
       [0.38895515],
       [0.6993505 ],
       [0.60630167],
       [0.6519456 ]], dtype=float32)

Optional: Set the correct initial bias.

These initial guesses are not great. You know the dataset is imbalanced, so set the output layer's bias to reflect that (see: A Recipe for Training Neural Networks: "init well"). This can help with initial convergence.

With the default bias initialization the loss should be about math.log(2) = 0.69314, since the model starts out predicting roughly 0.5 for every example:

results = model.evaluate(train_features, train_labels, batch_size=BATCH_SIZE, verbose=0)
print("Loss: {:0.4f}".format(results[0]))
Loss: 1.1068

The correct bias to set can be derived from:

$$ p_0 = pos/(pos + neg) = 1/(1+e^{-b_0}) $$
$$ b_0 = -log_e(1/p_0 - 1) $$
$$ b_0 = log_e(pos/neg)$$
initial_bias = np.log([pos/neg])
initial_bias
array([-6.35935934])

Setting this as the initial bias means the model will give much more reasonable initial guesses.

It should be near: pos/total = 0.0018
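
As a one-line check (illustrative, not from the original tutorial), passing the derived bias back through the sigmoid should recover the positive class prior:

# sigmoid(b_0) should reproduce pos/(pos + neg), i.e. roughly 0.0017.
print(1 / (1 + np.exp(-initial_bias)))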

model = make_model(output_bias=initial_bias)
model.predict(train_features[:10])
array([[0.00208238],
       [0.00070042],
       [0.00052977],
       [0.00079731],
       [0.00025865],
       [0.00088699],
       [0.00056424],
       [0.00120926],
       [0.00046146],
       [0.0012216 ]], dtype=float32)

With this initialization the initial loss should be approximately:

$$-p_0\log(p_0)-(1-p_0)\log(1-p_0) = 0.01317$$
results = model.evaluate(train_features, train_labels, batch_size=BATCH_SIZE, verbose=0)
print("Loss: {:0.4f}".format(results[0]))
Loss: 0.0140

This initial loss is about 50 times less than it would have been with naive initialization.

This way the model doesn't need to spend the first few epochs just learning that positive examples are unlikely. It also makes plots of the loss during training easier to read.
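
If you want to verify both loss estimates yourself, here is a minimal sketch (using only the pos and neg counts computed earlier):

import math

p0 = pos / (pos + neg)

# Naive initialization: the model starts near 0.5 for every example,
# so the binary cross-entropy is about -log(0.5) = log(2) ~= 0.693.
print(math.log(2))

# Careful initialization: the model starts near p0 for every example,
# so the loss is roughly the entropy of the label distribution (~0.013).
print(-p0 * math.log(p0) - (1 - p0) * math.log(1 - p0))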

Checkpoint the initial weights

To make the various training runs more comparable, keep this initial model's weights in a checkpoint file and load them into each model before training:

initial_weights = os.path.join(tempfile.mkdtemp(), 'initial_weights')
model.save_weights(initial_weights)

Confirm that the bias fix helps

Before moving on, quickly confirm that the careful bias initialization actually helped.

Train a model for 20 epochs, with and without this careful initialization, and compare the losses:

model = make_model()
model.load_weights(initial_weights)
model.layers[-1].bias.assign([0.0])
zero_bias_history = model.fit(
    train_features,
    train_labels,
    batch_size=BATCH_SIZE,
    epochs=20,
    validation_data=(val_features, val_labels), 
    verbose=0)
model = make_model()
model.load_weights(initial_weights)
careful_bias_history = model.fit(
    train_features,
    train_labels,
    batch_size=BATCH_SIZE,
    epochs=20,
    validation_data=(val_features, val_labels), 
    verbose=0)
def plot_loss(history, label, n):
  # Use a log scale on y-axis to show the wide range of values.
  plt.semilogy(history.epoch, history.history['loss'],
               color=colors[n], label='Train ' + label)
  plt.semilogy(history.epoch, history.history['val_loss'],
               color=colors[n], label='Val ' + label,
               linestyle="--")
  plt.xlabel('Epoch')
  plt.ylabel('Loss')
plot_loss(zero_bias_history, "Zero Bias", 0)
plot_loss(careful_bias_history, "Careful Bias", 1)

[Figure: training and validation loss curves for zero-bias vs. careful-bias initialization]

The above figure makes it clear: in terms of validation loss, on this problem, this careful initialization gives a clear advantage.

Train the model

model = make_model()
model.load_weights(initial_weights)
baseline_history = model.fit(
    train_features,
    train_labels,
    batch_size=BATCH_SIZE,
    epochs=EPOCHS,
    callbacks=[early_stopping],
    validation_data=(val_features, val_labels))
Epoch 1/100
90/90 [==============================] - 3s 15ms/step - loss: 0.0102 - tp: 97.0000 - fp: 12.0000 - tn: 227448.0000 - fn: 288.0000 - accuracy: 0.9987 - precision: 0.8899 - recall: 0.2519 - auc: 0.7719 - prc: 0.3759 - val_loss: 0.0067 - val_tp: 27.0000 - val_fp: 7.0000 - val_tn: 45469.0000 - val_fn: 66.0000 - val_accuracy: 0.9984 - val_precision: 0.7941 - val_recall: 0.2903 - val_auc: 0.9028 - val_prc: 0.6931
Epoch 2/100
90/90 [==============================] - 1s 6ms/step - loss: 0.0061 - tp: 120.0000 - fp: 22.0000 - tn: 181962.0000 - fn: 172.0000 - accuracy: 0.9989 - precision: 0.8451 - recall: 0.4110 - auc: 0.9071 - prc: 0.5482 - val_loss: 0.0053 - val_tp: 53.0000 - val_fp: 11.0000 - val_tn: 45465.0000 - val_fn: 40.0000 - val_accuracy: 0.9989 - val_precision: 0.8281 - val_recall: 0.5699 - val_auc: 0.9029 - val_prc: 0.6913
...
Epoch 100/100
90/90 [==============================] - 1s 6ms/step - loss: 0.0027 - tp: 199.0000 - fp: 23.0000 - tn: 181961.0000 - fn: 93.0000 - accuracy: 0.9994 - precision: 0.8964 - recall: 0.6815 - auc: 0.9532 - prc: 0.8008 - val_loss: 0.0035 - val_tp: 75.0000 - val_fp: 7.0000 - val_tn: 45469.0000 - val_fn: 18.0000 - val_accuracy: 0.9995 - val_precision: 0.9146 - val_recall: 0.8065 - val_auc: 0.9406 - val_prc: 0.8484

Check training history

In this section, you will produce plots of your model's accuracy and loss on the training and validation sets. These are useful to check for overfitting, which you can learn more about in the Overfit and underfit tutorial.

Additionally, you can produce these plots for any of the metrics you created above. False negatives are included as an example.

def plot_metrics(history):
  # Plot each tracked metric in a 2x2 grid: training curves are solid,
  # validation curves are dashed.
  metrics = ['loss', 'prc', 'precision', 'recall']
  for n, metric in enumerate(metrics):
    name = metric.replace("_"," ").capitalize()
    plt.subplot(2,2,n+1)
    plt.plot(history.epoch, history.history[metric], color=colors[0], label='Train')
    plt.plot(history.epoch, history.history['val_'+metric],
             color=colors[0], linestyle="--", label='Val')
    plt.xlabel('Epoch')
    plt.ylabel(name)
    if metric == 'loss':
      plt.ylim([0, plt.ylim()[1]])
    elif metric == 'auc':
      plt.ylim([0.8,1])
    else:
      plt.ylim([0,1])

    plt.legend()
plot_metrics(baseline_history)

[Figure: training and validation loss, PRC, precision, and recall curves for the baseline model]

Evaluate metrics

You can use a confusion matrix to summarize the actual vs. predicted labels, where the X axis is the predicted label and the Y axis is the actual label.

train_predictions_baseline = model.predict(train_features, batch_size=BATCH_SIZE)
test_predictions_baseline = model.predict(test_features, batch_size=BATCH_SIZE)
def plot_cm(labels, predictions, p=0.5):
  cm = confusion_matrix(labels, predictions > p)
  plt.figure(figsize=(5,5))
  sns.heatmap(cm, annot=True, fmt="d")
  plt.title('Confusion matrix @{:.2f}'.format(p))
  plt.ylabel('Actual label')
  plt.xlabel('Predicted label')

  print('Legitimate Transactions Detected (True Negatives): ', cm[0][0])
  print('Legitimate Transactions Incorrectly Detected (False Positives): ', cm[0][1])
  print('Fraudulent Transactions Missed (False Negatives): ', cm[1][0])
  print('Fraudulent Transactions Detected (True Positives): ', cm[1][1])
  print('Total Fraudulent Transactions: ', np.sum(cm[1]))

Evaluate your model on the test dataset and display the results for the metrics you created above.

baseline_results = model.evaluate(test_features, test_labels,
                                  batch_size=BATCH_SIZE, verbose=0)
for name, value in zip(model.metrics_names, baseline_results):
  print(name, ': ', value)
print()

plot_cm(test_labels, test_predictions_baseline)
loss :  0.004034834913909435
tp :  83.0
fp :  6.0
tn :  56849.0
fn :  24.0
accuracy :  0.9994733333587646
precision :  0.932584285736084
recall :  0.7757009267807007
auc :  0.9296237826347351
prc :  0.8169150352478027

Legitimate Transactions Detected (True Negatives):  56849
Legitimate Transactions Incorrectly Detected (False Positives):  6
Fraudulent Transactions Missed (False Negatives):  24
Fraudulent Transactions Detected (True Positives):  83
Total Fraudulent Transactions:  107

[Figure: confusion matrix for the baseline model at threshold 0.50]

If the model had predicted everything perfectly, this would be a diagonal matrix, where values off the main diagonal, indicating incorrect predictions, would be zero. In this case, the matrix shows that you have relatively few false positives, meaning that relatively few legitimate transactions were incorrectly flagged. However, you would likely want even fewer false negatives despite the cost of increasing the number of false positives. This trade-off may be preferable because false negatives would allow fraudulent transactions to go through, whereas false positives may merely cause an email to be sent to a customer asking them to verify their card activity.
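Since plot_cm takes the decision threshold as a parameter, one way to explore that trade-off is to lower the threshold and re-plot the matrix. A quick sketch (the 0.1 value is an arbitrary choice for illustration, not part of the original run):

# Lowering the threshold flags more transactions as fraudulent: fewer false
# negatives at the cost of more false positives. 0.1 is an arbitrary example.
plot_cm(test_labels, test_predictions_baseline, p=0.1)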

Plot the ROC

Now plot the ROC. This plot is useful because it shows, at a glance, the range of performance the model can reach just by tuning the output threshold.

def plot_roc(name, labels, predictions, **kwargs):
  # roc_curve returns the false- and true-positive rates at every threshold.
  fp, tp, _ = sklearn.metrics.roc_curve(labels, predictions)

  plt.plot(100*fp, 100*tp, label=name, linewidth=2, **kwargs)
  plt.xlabel('False positives [%]')
  plt.ylabel('True positives [%]')
  # Zoom in on the most informative corner of the curve.
  plt.xlim([-0.5,20])
  plt.ylim([80,100.5])
  plt.grid(True)
  ax = plt.gca()
  ax.set_aspect('equal')
plot_roc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_roc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')
plt.legend(loc='lower right')
<matplotlib.legend.Legend at 0x7f754c196450>

[Figure: ROC curves for the baseline model on the training and test sets]

Plot the AUPRC

Now plot the AUPRC: the area under the interpolated precision-recall curve, obtained by plotting (recall, precision) points for different values of the classification threshold. Depending on how it's calculated, PR AUC may be equivalent to the average precision of the model.

def plot_prc(name, labels, predictions, **kwargs):
  precision, recall, _ = sklearn.metrics.precision_recall_curve(labels, predictions)

  # Plot recall on the x axis and precision on the y axis to match the axis labels.
  plt.plot(recall, precision, label=name, linewidth=2, **kwargs)
  plt.xlabel('Recall')
  plt.ylabel('Precision')
  plt.grid(True)
  ax = plt.gca()
  ax.set_aspect('equal')
plot_prc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_prc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')
plt.legend(loc='lower right')
<matplotlib.legend.Legend at 0x7f754c19c490>

[Figure: precision-recall curves for the baseline model on the training and test sets]
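To sanity-check the note above about average precision, you can compare the trapezoidal area under this curve with sklearn's average_precision_score. A sketch on the baseline test predictions (the two quantities are computed differently, so expect them to be close rather than identical):

from sklearn.metrics import average_precision_score, precision_recall_curve

precision, recall, _ = precision_recall_curve(test_labels, test_predictions_baseline)
# `auc` integrates the curve with the trapezoidal rule; average precision
# uses a step-wise sum, so the two values are similar but not always equal.
print('Trapezoidal PR AUC:', sklearn.metrics.auc(recall, precision))
print('Average precision: ', average_precision_score(test_labels, test_predictions_baseline))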

The precision looks relatively high, but the recall and the area under the ROC curve (AUC) aren't as high as you might like. Classifiers often face challenges when trying to maximize both precision and recall, which is especially true when working with imbalanced datasets. It is important to consider the costs of different types of errors in the context of the problem you care about. In this example, a false negative (a fraudulent transaction is missed) may have a financial cost, while a false positive (a transaction is incorrectly flagged as fraudulent) may decrease user happiness.

Class weights

Calculate class weights

The goal is to identify fraudulent transactions, but you don't have very many of those positive samples to work with, so you would want the classifier to heavily weight the few examples that are available. You can do this by passing Keras weights for each class through a parameter. These will cause the model to "pay more attention" to examples from an under-represented class.

# Scaling by total/2 helps keep the loss to a similar magnitude.
# The sum of the weights of all examples stays the same.
weight_for_0 = (1 / neg) * (total / 2.0)
weight_for_1 = (1 / pos) * (total / 2.0)

class_weight = {0: weight_for_0, 1: weight_for_1}

print('Weight for class 0: {:.2f}'.format(weight_for_0))
print('Weight for class 1: {:.2f}'.format(weight_for_1))
Weight for class 0: 0.50
Weight for class 1: 289.44
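For reference, scikit-learn's 'balanced' heuristic produces the same values, since it also scales each class by n_samples / (n_classes * class_count). A sketch, not part of the original notebook:

from sklearn.utils.class_weight import compute_class_weight

# 'balanced' computes n_samples / (n_classes * bincount), which is exactly
# the (1 / count) * (total / 2) scaling used above.
sk_weights = compute_class_weight('balanced', classes=np.array([0, 1]), y=raw_df['Class'])
print(dict(zip([0, 1], sk_weights)))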

Train a model with class weights

Now try re-training and evaluating the model with class weights to see how that affects the predictions.

weighted_model = make_model()
weighted_model.load_weights(initial_weights)

weighted_history = weighted_model.fit(
    train_features,
    train_labels,
    batch_size=BATCH_SIZE,
    epochs=EPOCHS,
    callbacks=[early_stopping],
    validation_data=(val_features, val_labels),
    # The class weights go here
    class_weight=class_weight)
WARNING:tensorflow:From /tmpfs/src/tf_docs_env/lib/python3.7/site-packages/tensorflow/python/ops/array_ops.py:5049: calling gather (from tensorflow.python.ops.array_ops) with validate_indices is deprecated and will be removed in a future version.
Instructions for updating:
The `validate_indices` argument has no effect. Indices are always validated on CPU and never validated on GPU.
Epoch 1/100
90/90 [==============================] - 3s 15ms/step - loss: 1.9981 - tp: 125.0000 - fp: 15.0000 - tn: 238824.0000 - fn: 274.0000 - accuracy: 0.9988 - precision: 0.8929 - recall: 0.3133 - auc: 0.8291 - prc: 0.4539 - val_loss: 0.0062 - val_tp: 42.0000 - val_fp: 10.0000 - val_tn: 45466.0000 - val_fn: 51.0000 - val_accuracy: 0.9987 - val_precision: 0.8077 - val_recall: 0.4516 - val_auc: 0.9070 - val_prc: 0.6916
Epoch 2/100
90/90 [==============================] - 1s 6ms/step - loss: 0.8565 - tp: 158.0000 - fp: 60.0000 - tn: 181924.0000 - fn: 134.0000 - accuracy: 0.9989 - precision: 0.7248 - recall: 0.5411 - auc: 0.9138 - prc: 0.5554 - val_loss: 0.0067 - val_tp: 74.0000 - val_fp: 13.0000 - val_tn: 45463.0000 - val_fn: 19.0000 - val_accuracy: 0.9993 - val_precision: 0.8506 - val_recall: 0.7957 - val_auc: 0.9353 - val_prc: 0.6599
Epoch 3/100
90/90 [==============================] - 1s 7ms/step - loss: 0.6161 - tp: 207.0000 - fp: 116.0000 - tn: 181868.0000 - fn: 85.0000 - accuracy: 0.9989 - precision: 0.6409 - recall: 0.7089 - auc: 0.9221 - prc: 0.6361 - val_loss: 0.0082 - val_tp: 75.0000 - val_fp: 20.0000 - val_tn: 45456.0000 - val_fn: 18.0000 - val_accuracy: 0.9992 - val_precision: 0.7895 - val_recall: 0.8065 - val_auc: 0.9544 - val_prc: 0.6868
Epoch 4/100
90/90 [==============================] - 1s 6ms/step - loss: 0.4696 - tp: 224.0000 - fp: 202.0000 - tn: 181782.0000 - fn: 68.0000 - accuracy: 0.9985 - precision: 0.5258 - recall: 0.7671 - auc: 0.9450 - prc: 0.6351 - val_loss: 0.0099 - val_tp: 75.0000 - val_fp: 26.0000 - val_tn: 45450.0000 - val_fn: 18.0000 - val_accuracy: 0.9990 - val_precision: 0.7426 - val_recall: 0.8065 - val_auc: 0.9613 - val_prc: 0.7029
Epoch 5/100
90/90 [==============================] - 1s 7ms/step - loss: 0.4348 - tp: 225.0000 - fp: 368.0000 - tn: 181616.0000 - fn: 67.0000 - accuracy: 0.9976 - precision: 0.3794 - recall: 0.7705 - auc: 0.9536 - prc: 0.6027 - val_loss: 0.0124 - val_tp: 75.0000 - val_fp: 35.0000 - val_tn: 45441.0000 - val_fn: 18.0000 - val_accuracy: 0.9988 - val_precision: 0.6818 - val_recall: 0.8065 - val_auc: 0.9639 - val_prc: 0.7157
Epoch 6/100
90/90 [==============================] - 1s 6ms/step - loss: 0.3410 - tp: 234.0000 - fp: 794.0000 - tn: 181190.0000 - fn: 58.0000 - accuracy: 0.9953 - precision: 0.2276 - recall: 0.8014 - auc: 0.9630 - prc: 0.5936 - val_loss: 0.0162 - val_tp: 76.0000 - val_fp: 63.0000 - val_tn: 45413.0000 - val_fn: 17.0000 - val_accuracy: 0.9982 - val_precision: 0.5468 - val_recall: 0.8172 - val_auc: 0.9768 - val_prc: 0.7245
Epoch 7/100
90/90 [==============================] - 1s 6ms/step - loss: 0.3527 - tp: 236.0000 - fp: 1177.0000 - tn: 180807.0000 - fn: 56.0000 - accuracy: 0.9932 - precision: 0.1670 - recall: 0.8082 - auc: 0.9532 - prc: 0.5324 - val_loss: 0.0209 - val_tp: 77.0000 - val_fp: 96.0000 - val_tn: 45380.0000 - val_fn: 16.0000 - val_accuracy: 0.9975 - val_precision: 0.4451 - val_recall: 0.8280 - val_auc: 0.9751 - val_prc: 0.7039
Epoch 8/100
90/90 [==============================] - 1s 6ms/step - loss: 0.3186 - tp: 241.0000 - fp: 1611.0000 - tn: 180373.0000 - fn: 51.0000 - accuracy: 0.9909 - precision: 0.1301 - recall: 0.8253 - auc: 0.9506 - prc: 0.4802 - val_loss: 0.0259 - val_tp: 79.0000 - val_fp: 165.0000 - val_tn: 45311.0000 - val_fn: 14.0000 - val_accuracy: 0.9961 - val_precision: 0.3238 - val_recall: 0.8495 - val_auc: 0.9772 - val_prc: 0.6656
Epoch 9/100
90/90 [==============================] - 1s 6ms/step - loss: 0.2194 - tp: 252.0000 - fp: 2137.0000 - tn: 179847.0000 - fn: 40.0000 - accuracy: 0.9881 - precision: 0.1055 - recall: 0.8630 - auc: 0.9760 - prc: 0.4433 - val_loss: 0.0324 - val_tp: 83.0000 - val_fp: 300.0000 - val_tn: 45176.0000 - val_fn: 10.0000 - val_accuracy: 0.9932 - val_precision: 0.2167 - val_recall: 0.8925 - val_auc: 0.9816 - val_prc: 0.6486
Epoch 10/100
90/90 [==============================] - 1s 6ms/step - loss: 0.2573 - tp: 246.0000 - fp: 2753.0000 - tn: 179231.0000 - fn: 46.0000 - accuracy: 0.9846 - precision: 0.0820 - recall: 0.8425 - auc: 0.9672 - prc: 0.3566 - val_loss: 0.0393 - val_tp: 83.0000 - val_fp: 422.0000 - val_tn: 45054.0000 - val_fn: 10.0000 - val_accuracy: 0.9905 - val_precision: 0.1644 - val_recall: 0.8925 - val_auc: 0.9796 - val_prc: 0.6367
Epoch 11/100
90/90 [==============================] - 1s 6ms/step - loss: 0.2845 - tp: 250.0000 - fp: 3369.0000 - tn: 178615.0000 - fn: 42.0000 - accuracy: 0.9813 - precision: 0.0691 - recall: 0.8562 - auc: 0.9576 - prc: 0.3322 - val_loss: 0.0482 - val_tp: 83.0000 - val_fp: 525.0000 - val_tn: 44951.0000 - val_fn: 10.0000 - val_accuracy: 0.9883 - val_precision: 0.1365 - val_recall: 0.8925 - val_auc: 0.9811 - val_prc: 0.6220
Epoch 12/100
90/90 [==============================] - 1s 6ms/step - loss: 0.2208 - tp: 252.0000 - fp: 3985.0000 - tn: 177999.0000 - fn: 40.0000 - accuracy: 0.9779 - precision: 0.0595 - recall: 0.8630 - auc: 0.9737 - prc: 0.2748 - val_loss: 0.0553 - val_tp: 83.0000 - val_fp: 605.0000 - val_tn: 44871.0000 - val_fn: 10.0000 - val_accuracy: 0.9865 - val_precision: 0.1206 - val_recall: 0.8925 - val_auc: 0.9810 - val_prc: 0.6086
Epoch 13/100
90/90 [==============================] - 1s 6ms/step - loss: 0.2170 - tp: 256.0000 - fp: 4571.0000 - tn: 177413.0000 - fn: 36.0000 - accuracy: 0.9747 - precision: 0.0530 - recall: 0.8767 - auc: 0.9715 - prc: 0.2528 - val_loss: 0.0599 - val_tp: 83.0000 - val_fp: 642.0000 - val_tn: 44834.0000 - val_fn: 10.0000 - val_accuracy: 0.9857 - val_precision: 0.1145 - val_recall: 0.8925 - val_auc: 0.9825 - val_prc: 0.6033
Epoch 14/100
90/90 [==============================] - 1s 6ms/step - loss: 0.2035 - tp: 258.0000 - fp: 4974.0000 - tn: 177010.0000 - fn: 34.0000 - accuracy: 0.9725 - precision: 0.0493 - recall: 0.8836 - auc: 0.9748 - prc: 0.2492 - val_loss: 0.0633 - val_tp: 83.0000 - val_fp: 675.0000 - val_tn: 44801.0000 - val_fn: 10.0000 - val_accuracy: 0.9850 - val_precision: 0.1095 - val_recall: 0.8925 - val_auc: 0.9823 - val_prc: 0.5979
Epoch 15/100
90/90 [==============================] - 1s 7ms/step - loss: 0.2184 - tp: 259.0000 - fp: 5469.0000 - tn: 176515.0000 - fn: 33.0000 - accuracy: 0.9698 - precision: 0.0452 - recall: 0.8870 - auc: 0.9698 - prc: 0.2335 - val_loss: 0.0693 - val_tp: 83.0000 - val_fp: 756.0000 - val_tn: 44720.0000 - val_fn: 10.0000 - val_accuracy: 0.9832 - val_precision: 0.0989 - val_recall: 0.8925 - val_auc: 0.9825 - val_prc: 0.5837
Epoch 16/100
90/90 [==============================] - 1s 7ms/step - loss: 0.2258 - tp: 251.0000 - fp: 5696.0000 - tn: 176288.0000 - fn: 41.0000 - accuracy: 0.9685 - precision: 0.0422 - recall: 0.8596 - auc: 0.9725 - prc: 0.2237 - val_loss: 0.0739 - val_tp: 84.0000 - val_fp: 818.0000 - val_tn: 44658.0000 - val_fn: 9.0000 - val_accuracy: 0.9819 - val_precision: 0.0931 - val_recall: 0.9032 - val_auc: 0.9821 - val_prc: 0.5640
Restoring model weights from the end of the best epoch.
Epoch 00016: early stopping

Check training history

plot_metrics(weighted_history)

[Figure: training and validation metric curves for the class-weighted model]

Evaluate metrics

train_predictions_weighted = weighted_model.predict(train_features, batch_size=BATCH_SIZE)
test_predictions_weighted = weighted_model.predict(test_features, batch_size=BATCH_SIZE)
weighted_results = weighted_model.evaluate(test_features, test_labels,
                                           batch_size=BATCH_SIZE, verbose=0)
for name, value in zip(weighted_model.metrics_names, weighted_results):
  print(name, ': ', value)
print()

plot_cm(test_labels, test_predictions_weighted)
loss :  0.014164570719003677
tp :  87.0
fp :  66.0
tn :  56789.0
fn :  20.0
accuracy :  0.9984902143478394
precision :  0.5686274766921997
recall :  0.8130841255187988
auc :  0.9428448677062988
prc :  0.7415629029273987

Legitimate Transactions Detected (True Negatives):  56789
Legitimate Transactions Incorrectly Detected (False Positives):  66
Fraudulent Transactions Missed (False Negatives):  20
Fraudulent Transactions Detected (True Positives):  87
Total Fraudulent Transactions:  107

[Figure: confusion matrix for the class-weighted model at threshold 0.50]

Here you can see that with class weights, the accuracy and precision are lower because there are more false positives, but conversely the recall and AUC are higher because the model also found more true positives. Despite having lower accuracy, this model has higher recall (and identifies more fraudulent transactions). Of course, there is a cost to both types of errors (you wouldn't want to bug users by flagging too many legitimate transactions as fraudulent, either). Carefully consider the trade-offs between these different types of errors for your application.
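One way to act on that trade-off is to pick the operating threshold from the ROC curve itself. The sketch below selects the lowest threshold whose false-positive rate stays within a budget; the 1% budget is a hypothetical value chosen for illustration:

fpr, tpr, thresholds = sklearn.metrics.roc_curve(test_labels, test_predictions_weighted)

budget = 0.01  # hypothetical: tolerate at most 1% false positives
idx = np.argmax(fpr > budget) - 1  # last threshold still within the budget
print('Threshold {:.3f}: FPR {:.4f}, TPR {:.4f}'.format(
    thresholds[idx], fpr[idx], tpr[idx]))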

Plot the ROC

plot_roc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_roc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')

plot_roc("Train Weighted", train_labels, train_predictions_weighted, color=colors[1])
plot_roc("Test Weighted", test_labels, test_predictions_weighted, color=colors[1], linestyle='--')


plt.legend(loc='lower right')
<matplotlib.legend.Legend at 0x7f7144242b10>

[Figure: ROC curves comparing the baseline and class-weighted models]

Plot the AUPRC

plot_prc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_prc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')

plot_prc("Train Weighted", train_labels, train_predictions_weighted, color=colors[1])
plot_prc("Test Weighted", test_labels, test_predictions_weighted, color=colors[1], linestyle='--')


plt.legend(loc='lower right')
<matplotlib.legend.Legend at 0x7f751453b910>

[Figure: precision-recall curves comparing the baseline and class-weighted models]

Oversampling

Oversample the minority class

A related approach would be to resample the dataset by oversampling the minority class.

pos_features = train_features[bool_train_labels]
neg_features = train_features[~bool_train_labels]

pos_labels = train_labels[bool_train_labels]
neg_labels = train_labels[~bool_train_labels]

Using NumPy

You can balance the dataset manually by choosing the right number of random indices from the positive examples:

ids = np.arange(len(pos_features))
choices = np.random.choice(ids, len(neg_features))

res_pos_features = pos_features[choices]
res_pos_labels = pos_labels[choices]

res_pos_features.shape
(181984, 29)
resampled_features = np.concatenate([res_pos_features, neg_features], axis=0)
resampled_labels = np.concatenate([res_pos_labels, neg_labels], axis=0)

order = np.arange(len(resampled_labels))
np.random.shuffle(order)
resampled_features = resampled_features[order]
resampled_labels = resampled_labels[order]

resampled_features.shape
(363968, 29)
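As a quick sanity check (not in the original notebook), confirm that the resampled training set is now evenly split:

# Both classes should now match the negative-class count
# (the exact values depend on the random split).
print(np.bincount(resampled_labels))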

Using tf.data

If you're using tf.data, the easiest way to produce balanced examples is to start with a positive and a negative dataset, and merge them. See the tf.data guide for more examples.

BUFFER_SIZE = 100000

def make_ds(features, labels):
  ds = tf.data.Dataset.from_tensor_slices((features, labels))#.cache()
  ds = ds.shuffle(BUFFER_SIZE).repeat()
  return ds

pos_ds = make_ds(pos_features, pos_labels)
neg_ds = make_ds(neg_features, neg_labels)

Each dataset provides (feature, label) pairs:

for features, label in pos_ds.take(1):
  print("Features:\n", features.numpy())
  print()
  print("Label: ", label.numpy())
Features:
 [-1.04413116 -0.69043781 -0.75372405  0.76864679 -0.00923547  0.31793958
  3.07078322 -0.95876958 -0.05637099  1.23484144  1.01081771 -0.7284406
 -1.54086495  0.34734429  0.32568171 -0.49141038 -0.98458939  0.58257181
  1.0761878   0.04288066 -0.42865418  0.79503112  0.79045437  1.25049344
 -0.27659652 -1.24646206  1.26815293 -0.2826666   1.73542639]

Label:  1

Merge the two together using experimental.sample_from_datasets (in newer TensorFlow releases, this is also available as tf.data.Dataset.sample_from_datasets):

resampled_ds = tf.data.experimental.sample_from_datasets([pos_ds, neg_ds], weights=[0.5, 0.5])
resampled_ds = resampled_ds.batch(BATCH_SIZE).prefetch(2)
for features, label in resampled_ds.take(1):
  print(label.numpy().mean())
0.50830078125

To use this dataset, you'll need the number of steps per epoch.

The definition of "epoch" in this case is less clear. Say it's the number of batches required to see each negative example once; since roughly half of each batch is negative, that works out to about 2 * neg / BATCH_SIZE batches:

resampled_steps_per_epoch = np.ceil(2.0*neg/BATCH_SIZE)
resampled_steps_per_epoch
278.0

Train on the oversampled data

Now try training the model with the resampled dataset instead of using class weights to see how these methods compare.

resampled_model = make_model()
resampled_model.load_weights(initial_weights)

# Reset the bias to zero, since this dataset is balanced.
output_layer = resampled_model.layers[-1] 
output_layer.bias.assign([0])

val_ds = tf.data.Dataset.from_tensor_slices((val_features, val_labels)).cache()
val_ds = val_ds.batch(BATCH_SIZE).prefetch(2) 

resampled_history = resampled_model.fit(
    resampled_ds,
    epochs=EPOCHS,
    steps_per_epoch=resampled_steps_per_epoch,
    callbacks=[early_stopping],
    validation_data=val_ds)
Epoch 1/100
278/278 [==============================] - 10s 30ms/step - loss: 0.3890 - tp: 238572.0000 - fp: 54563.0000 - tn: 286818.0000 - fn: 46353.0000 - accuracy: 0.8389 - precision: 0.8139 - recall: 0.8373 - auc: 0.9157 - prc: 0.9306 - val_loss: 0.2080 - val_tp: 84.0000 - val_fp: 1191.0000 - val_tn: 44285.0000 - val_fn: 9.0000 - val_accuracy: 0.9737 - val_precision: 0.0659 - val_recall: 0.9032 - val_auc: 0.9735 - val_prc: 0.7107
Epoch 2/100
278/278 [==============================] - 7s 27ms/step - loss: 0.1865 - tp: 260959.0000 - fp: 16763.0000 - tn: 268148.0000 - fn: 23474.0000 - accuracy: 0.9293 - precision: 0.9396 - recall: 0.9175 - auc: 0.9801 - prc: 0.9831 - val_loss: 0.1075 - val_tp: 84.0000 - val_fp: 883.0000 - val_tn: 44593.0000 - val_fn: 9.0000 - val_accuracy: 0.9804 - val_precision: 0.0869 - val_recall: 0.9032 - val_auc: 0.9754 - val_prc: 0.7344
Epoch 3/100
278/278 [==============================] - 7s 26ms/step - loss: 0.1408 - tp: 265071.0000 - fp: 11139.0000 - tn: 273858.0000 - fn: 19276.0000 - accuracy: 0.9466 - precision: 0.9597 - recall: 0.9322 - auc: 0.9893 - prc: 0.9901 - val_loss: 0.0784 - val_tp: 84.0000 - val_fp: 820.0000 - val_tn: 44656.0000 - val_fn: 9.0000 - val_accuracy: 0.9818 - val_precision: 0.0929 - val_recall: 0.9032 - val_auc: 0.9758 - val_prc: 0.7237
Epoch 4/100
278/278 [==============================] - 7s 26ms/step - loss: 0.1185 - tp: 267402.0000 - fp: 9240.0000 - tn: 275635.0000 - fn: 17067.0000 - accuracy: 0.9538 - precision: 0.9666 - recall: 0.9400 - auc: 0.9927 - prc: 0.9929 - val_loss: 0.0646 - val_tp: 84.0000 - val_fp: 754.0000 - val_tn: 44722.0000 - val_fn: 9.0000 - val_accuracy: 0.9833 - val_precision: 0.1002 - val_recall: 0.9032 - val_auc: 0.9760 - val_prc: 0.7261
Epoch 5/100
278/278 [==============================] - 7s 26ms/step - loss: 0.1052 - tp: 269706.0000 - fp: 8201.0000 - tn: 276246.0000 - fn: 15191.0000 - accuracy: 0.9589 - precision: 0.9705 - recall: 0.9467 - auc: 0.9943 - prc: 0.9943 - val_loss: 0.0557 - val_tp: 84.0000 - val_fp: 715.0000 - val_tn: 44761.0000 - val_fn: 9.0000 - val_accuracy: 0.9841 - val_precision: 0.1051 - val_recall: 0.9032 - val_auc: 0.9775 - val_prc: 0.7051
Epoch 6/100
278/278 [==============================] - 7s 27ms/step - loss: 0.0955 - tp: 270846.0000 - fp: 7544.0000 - tn: 277248.0000 - fn: 13706.0000 - accuracy: 0.9627 - precision: 0.9729 - recall: 0.9518 - auc: 0.9953 - prc: 0.9952 - val_loss: 0.0505 - val_tp: 84.0000 - val_fp: 686.0000 - val_tn: 44790.0000 - val_fn: 9.0000 - val_accuracy: 0.9847 - val_precision: 0.1091 - val_recall: 0.9032 - val_auc: 0.9780 - val_prc: 0.6841
Epoch 7/100
278/278 [==============================] - 8s 27ms/step - loss: 0.0890 - tp: 271051.0000 - fp: 7252.0000 - tn: 278264.0000 - fn: 12777.0000 - accuracy: 0.9648 - precision: 0.9739 - recall: 0.9550 - auc: 0.9960 - prc: 0.9957 - val_loss: 0.0463 - val_tp: 84.0000 - val_fp: 637.0000 - val_tn: 44839.0000 - val_fn: 9.0000 - val_accuracy: 0.9858 - val_precision: 0.1165 - val_recall: 0.9032 - val_auc: 0.9784 - val_prc: 0.6775
Epoch 8/100
278/278 [==============================] - 7s 27ms/step - loss: 0.0833 - tp: 273018.0000 - fp: 6871.0000 - tn: 277463.0000 - fn: 11992.0000 - accuracy: 0.9669 - precision: 0.9755 - recall: 0.9579 - auc: 0.9964 - prc: 0.9962 - val_loss: 0.0442 - val_tp: 84.0000 - val_fp: 622.0000 - val_tn: 44854.0000 - val_fn: 9.0000 - val_accuracy: 0.9862 - val_precision: 0.1190 - val_recall: 0.9032 - val_auc: 0.9784 - val_prc: 0.6667
Epoch 9/100
278/278 [==============================] - 7s 27ms/step - loss: 0.0785 - tp: 274595.0000 - fp: 6700.0000 - tn: 277213.0000 - fn: 10836.0000 - accuracy: 0.9692 - precision: 0.9762 - recall: 0.9620 - auc: 0.9968 - prc: 0.9965 - val_loss: 0.0416 - val_tp: 83.0000 - val_fp: 593.0000 - val_tn: 44883.0000 - val_fn: 10.0000 - val_accuracy: 0.9868 - val_precision: 0.1228 - val_recall: 0.8925 - val_auc: 0.9787 - val_prc: 0.6603
Epoch 10/100
278/278 [==============================] - 7s 26ms/step - loss: 0.0741 - tp: 275035.0000 - fp: 6506.0000 - tn: 277749.0000 - fn: 10054.0000 - accuracy: 0.9709 - precision: 0.9769 - recall: 0.9647 - auc: 0.9971 - prc: 0.9967 - val_loss: 0.0396 - val_tp: 83.0000 - val_fp: 577.0000 - val_tn: 44899.0000 - val_fn: 10.0000 - val_accuracy: 0.9871 - val_precision: 0.1258 - val_recall: 0.8925 - val_auc: 0.9747 - val_prc: 0.6611
Epoch 11/100
278/278 [==============================] - 7s 25ms/step - loss: 0.0710 - tp: 275325.0000 - fp: 6260.0000 - tn: 278414.0000 - fn: 9345.0000 - accuracy: 0.9726 - precision: 0.9778 - recall: 0.9672 - auc: 0.9973 - prc: 0.9970 - val_loss: 0.0376 - val_tp: 83.0000 - val_fp: 542.0000 - val_tn: 44934.0000 - val_fn: 10.0000 - val_accuracy: 0.9879 - val_precision: 0.1328 - val_recall: 0.8925 - val_auc: 0.9661 - val_prc: 0.6674
Epoch 12/100
278/278 [==============================] - 7s 26ms/step - loss: 0.0679 - tp: 276700.0000 - fp: 6131.0000 - tn: 278197.0000 - fn: 8316.0000 - accuracy: 0.9746 - precision: 0.9783 - recall: 0.9708 - auc: 0.9975 - prc: 0.9971 - val_loss: 0.0352 - val_tp: 83.0000 - val_fp: 513.0000 - val_tn: 44963.0000 - val_fn: 10.0000 - val_accuracy: 0.9885 - val_precision: 0.1393 - val_recall: 0.8925 - val_auc: 0.9667 - val_prc: 0.6673
Restoring model weights from the end of the best epoch.
Epoch 00012: early stopping

If the training process were considering the whole dataset on each gradient update, this oversampling would be basically identical to class weighting.

But when training the model batch-wise, as you did here, the oversampled data provides a smoother gradient signal: instead of each positive example being shown in one batch with a large weight, it is shown in many different batches each time with a small weight.

This smoother gradient signal makes it easier to train the model.
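To get a feel for the scale of the difference, here is a rough back-of-the-envelope calculation. It assumes the ~50/50 sampling set up above and is not part of the original notebook:

# With about half of each batch positive, the few hundred training positives
# are each drawn many times per resampled "epoch", instead of appearing once
# with a weight of ~289 as under class weighting.
pos_train = bool_train_labels.sum()
draws_per_positive = resampled_steps_per_epoch * BATCH_SIZE * 0.5 / pos_train
print('Approximate draws per positive example per epoch: {:.0f}'.format(draws_per_positive))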

Check training history

Note that the distributions of the metrics will be different here, because the training data has a totally different distribution from the validation and test data.

plot_metrics(resampled_history)

[Figure: training and validation metric curves for the resampled model]

Re-train

Because training is easier on the balanced data, the above training procedure may overfit quickly.

So break up the epochs to give tf.keras.callbacks.EarlyStopping finer control over when to stop training.

resampled_model = make_model()
resampled_model.load_weights(initial_weights)

# Reset the bias to zero, since this dataset is balanced.
output_layer = resampled_model.layers[-1] 
output_layer.bias.assign([0])

resampled_history = resampled_model.fit(
    resampled_ds,
    # These are not real epochs
    steps_per_epoch=20,
    epochs=10*EPOCHS,
    callbacks=[early_stopping],
    validation_data=(val_ds))
Epoch 1/1000
20/20 [==============================] - 3s 69ms/step - loss: 1.0239 - tp: 8478.0000 - fp: 5972.0000 - tn: 59984.0000 - fn: 12095.0000 - accuracy: 0.7912 - precision: 0.5867 - recall: 0.4121 - auc: 0.8196 - prc: 0.6214 - val_loss: 0.4766 - val_tp: 66.0000 - val_fp: 5871.0000 - val_tn: 39605.0000 - val_fn: 27.0000 - val_accuracy: 0.8706 - val_precision: 0.0111 - val_recall: 0.7097 - val_auc: 0.8387 - val_prc: 0.3138
Epoch 2/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.5904 - tp: 14300.0000 - fp: 6165.0000 - tn: 14157.0000 - fn: 6338.0000 - accuracy: 0.6948 - precision: 0.6988 - recall: 0.6929 - auc: 0.7598 - prc: 0.8344 - val_loss: 0.4917 - val_tp: 79.0000 - val_fp: 6943.0000 - val_tn: 38533.0000 - val_fn: 14.0000 - val_accuracy: 0.8473 - val_precision: 0.0113 - val_recall: 0.8495 - val_auc: 0.9217 - val_prc: 0.6175
Epoch 3/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.4641 - tp: 16675.0000 - fp: 6478.0000 - tn: 14174.0000 - fn: 3633.0000 - accuracy: 0.7531 - precision: 0.7202 - recall: 0.8211 - auc: 0.8600 - prc: 0.9030 - val_loss: 0.4731 - val_tp: 81.0000 - val_fp: 6295.0000 - val_tn: 39181.0000 - val_fn: 12.0000 - val_accuracy: 0.8616 - val_precision: 0.0127 - val_recall: 0.8710 - val_auc: 0.9385 - val_prc: 0.6454
Epoch 4/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.4083 - tp: 17584.0000 - fp: 5691.0000 - tn: 14793.0000 - fn: 2892.0000 - accuracy: 0.7905 - precision: 0.7555 - recall: 0.8588 - auc: 0.8976 - prc: 0.9293 - val_loss: 0.4422 - val_tp: 82.0000 - val_fp: 5087.0000 - val_tn: 40389.0000 - val_fn: 11.0000 - val_accuracy: 0.8881 - val_precision: 0.0159 - val_recall: 0.8817 - val_auc: 0.9490 - val_prc: 0.6572
Epoch 5/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.3798 - tp: 17815.0000 - fp: 5030.0000 - tn: 15490.0000 - fn: 2625.0000 - accuracy: 0.8131 - precision: 0.7798 - recall: 0.8716 - auc: 0.9132 - prc: 0.9397 - val_loss: 0.4100 - val_tp: 81.0000 - val_fp: 3957.0000 - val_tn: 41519.0000 - val_fn: 12.0000 - val_accuracy: 0.9129 - val_precision: 0.0201 - val_recall: 0.8710 - val_auc: 0.9562 - val_prc: 0.6596
Epoch 6/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.3494 - tp: 18360.0000 - fp: 4423.0000 - tn: 15777.0000 - fn: 2400.0000 - accuracy: 0.8334 - precision: 0.8059 - recall: 0.8844 - auc: 0.9275 - prc: 0.9502 - val_loss: 0.3790 - val_tp: 81.0000 - val_fp: 3133.0000 - val_tn: 42343.0000 - val_fn: 12.0000 - val_accuracy: 0.9310 - val_precision: 0.0252 - val_recall: 0.8710 - val_auc: 0.9611 - val_prc: 0.6736
Epoch 7/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.3307 - tp: 18214.0000 - fp: 3941.0000 - tn: 16539.0000 - fn: 2266.0000 - accuracy: 0.8485 - precision: 0.8221 - recall: 0.8894 - auc: 0.9351 - prc: 0.9541 - val_loss: 0.3493 - val_tp: 82.0000 - val_fp: 2454.0000 - val_tn: 43022.0000 - val_fn: 11.0000 - val_accuracy: 0.9459 - val_precision: 0.0323 - val_recall: 0.8817 - val_auc: 0.9649 - val_prc: 0.6827
Epoch 8/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.3134 - tp: 18246.0000 - fp: 3355.0000 - tn: 17195.0000 - fn: 2164.0000 - accuracy: 0.8653 - precision: 0.8447 - recall: 0.8940 - auc: 0.9420 - prc: 0.9584 - val_loss: 0.3222 - val_tp: 82.0000 - val_fp: 1976.0000 - val_tn: 43500.0000 - val_fn: 11.0000 - val_accuracy: 0.9564 - val_precision: 0.0398 - val_recall: 0.8817 - val_auc: 0.9676 - val_prc: 0.6907
Epoch 9/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.2960 - tp: 18249.0000 - fp: 3043.0000 - tn: 17504.0000 - fn: 2164.0000 - accuracy: 0.8729 - precision: 0.8571 - recall: 0.8940 - auc: 0.9473 - prc: 0.9618 - val_loss: 0.2977 - val_tp: 84.0000 - val_fp: 1734.0000 - val_tn: 43742.0000 - val_fn: 9.0000 - val_accuracy: 0.9618 - val_precision: 0.0462 - val_recall: 0.9032 - val_auc: 0.9694 - val_prc: 0.6985
Epoch 10/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.2790 - tp: 18408.0000 - fp: 2643.0000 - tn: 17812.0000 - fn: 2097.0000 - accuracy: 0.8843 - precision: 0.8744 - recall: 0.8977 - auc: 0.9531 - prc: 0.9655 - val_loss: 0.2753 - val_tp: 84.0000 - val_fp: 1542.0000 - val_tn: 43934.0000 - val_fn: 9.0000 - val_accuracy: 0.9660 - val_precision: 0.0517 - val_recall: 0.9032 - val_auc: 0.9712 - val_prc: 0.7014
Epoch 11/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.2674 - tp: 18380.0000 - fp: 2376.0000 - tn: 18191.0000 - fn: 2013.0000 - accuracy: 0.8928 - precision: 0.8855 - recall: 0.9013 - auc: 0.9570 - prc: 0.9678 - val_loss: 0.2550 - val_tp: 84.0000 - val_fp: 1413.0000 - val_tn: 44063.0000 - val_fn: 9.0000 - val_accuracy: 0.9688 - val_precision: 0.0561 - val_recall: 0.9032 - val_auc: 0.9721 - val_prc: 0.6979
Epoch 12/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.2534 - tp: 18500.0000 - fp: 2204.0000 - tn: 18338.0000 - fn: 1918.0000 - accuracy: 0.8994 - precision: 0.8935 - recall: 0.9061 - auc: 0.9625 - prc: 0.9713 - val_loss: 0.2358 - val_tp: 84.0000 - val_fp: 1299.0000 - val_tn: 44177.0000 - val_fn: 9.0000 - val_accuracy: 0.9713 - val_precision: 0.0607 - val_recall: 0.9032 - val_auc: 0.9725 - val_prc: 0.7011
Epoch 13/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.2432 - tp: 18491.0000 - fp: 1982.0000 - tn: 18622.0000 - fn: 1865.0000 - accuracy: 0.9061 - precision: 0.9032 - recall: 0.9084 - auc: 0.9651 - prc: 0.9731 - val_loss: 0.2188 - val_tp: 84.0000 - val_fp: 1230.0000 - val_tn: 44246.0000 - val_fn: 9.0000 - val_accuracy: 0.9728 - val_precision: 0.0639 - val_recall: 0.9032 - val_auc: 0.9732 - val_prc: 0.7048
Epoch 14/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.2319 - tp: 18537.0000 - fp: 1749.0000 - tn: 18805.0000 - fn: 1869.0000 - accuracy: 0.9117 - precision: 0.9138 - recall: 0.9084 - auc: 0.9680 - prc: 0.9748 - val_loss: 0.2043 - val_tp: 84.0000 - val_fp: 1161.0000 - val_tn: 44315.0000 - val_fn: 9.0000 - val_accuracy: 0.9743 - val_precision: 0.0675 - val_recall: 0.9032 - val_auc: 0.9736 - val_prc: 0.7107
Epoch 15/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.2229 - tp: 18660.0000 - fp: 1664.0000 - tn: 18815.0000 - fn: 1821.0000 - accuracy: 0.9149 - precision: 0.9181 - recall: 0.9111 - auc: 0.9709 - prc: 0.9770 - val_loss: 0.1910 - val_tp: 84.0000 - val_fp: 1108.0000 - val_tn: 44368.0000 - val_fn: 9.0000 - val_accuracy: 0.9755 - val_precision: 0.0705 - val_recall: 0.9032 - val_auc: 0.9738 - val_prc: 0.7126
Epoch 16/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.2154 - tp: 18720.0000 - fp: 1545.0000 - tn: 18926.0000 - fn: 1769.0000 - accuracy: 0.9191 - precision: 0.9238 - recall: 0.9137 - auc: 0.9730 - prc: 0.9782 - val_loss: 0.1795 - val_tp: 84.0000 - val_fp: 1076.0000 - val_tn: 44400.0000 - val_fn: 9.0000 - val_accuracy: 0.9762 - val_precision: 0.0724 - val_recall: 0.9032 - val_auc: 0.9744 - val_prc: 0.7088
Epoch 17/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.2098 - tp: 18795.0000 - fp: 1427.0000 - tn: 18902.0000 - fn: 1836.0000 - accuracy: 0.9203 - precision: 0.9294 - recall: 0.9110 - auc: 0.9742 - prc: 0.9793 - val_loss: 0.1693 - val_tp: 84.0000 - val_fp: 1049.0000 - val_tn: 44427.0000 - val_fn: 9.0000 - val_accuracy: 0.9768 - val_precision: 0.0741 - val_recall: 0.9032 - val_auc: 0.9747 - val_prc: 0.7129
Epoch 18/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.2013 - tp: 18839.0000 - fp: 1345.0000 - tn: 19037.0000 - fn: 1739.0000 - accuracy: 0.9247 - precision: 0.9334 - recall: 0.9155 - auc: 0.9765 - prc: 0.9810 - val_loss: 0.1605 - val_tp: 84.0000 - val_fp: 1026.0000 - val_tn: 44450.0000 - val_fn: 9.0000 - val_accuracy: 0.9773 - val_precision: 0.0757 - val_recall: 0.9032 - val_auc: 0.9751 - val_prc: 0.7197
Epoch 19/1000
20/20 [==============================] - 1s 29ms/step - loss: 0.1948 - tp: 18674.0000 - fp: 1262.0000 - tn: 19278.0000 - fn: 1746.0000 - accuracy: 0.9266 - precision: 0.9367 - recall: 0.9145 - auc: 0.9780 - prc: 0.9814 - val_loss: 0.1521 - val_tp: 84.0000 - val_fp: 1001.0000 - val_tn: 44475.0000 - val_fn: 9.0000 - val_accuracy: 0.9778 - val_precision: 0.0774 - val_recall: 0.9032 - val_auc: 0.9752 - val_prc: 0.7222
Epoch 20/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.1897 - tp: 18894.0000 - fp: 1218.0000 - tn: 19132.0000 - fn: 1716.0000 - accuracy: 0.9284 - precision: 0.9394 - recall: 0.9167 - auc: 0.9795 - prc: 0.9828 - val_loss: 0.1453 - val_tp: 84.0000 - val_fp: 1001.0000 - val_tn: 44475.0000 - val_fn: 9.0000 - val_accuracy: 0.9778 - val_precision: 0.0774 - val_recall: 0.9032 - val_auc: 0.9752 - val_prc: 0.7256
Epoch 21/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.1832 - tp: 18874.0000 - fp: 1117.0000 - tn: 19334.0000 - fn: 1635.0000 - accuracy: 0.9328 - precision: 0.9441 - recall: 0.9203 - auc: 0.9807 - prc: 0.9837 - val_loss: 0.1388 - val_tp: 84.0000 - val_fp: 992.0000 - val_tn: 44484.0000 - val_fn: 9.0000 - val_accuracy: 0.9780 - val_precision: 0.0781 - val_recall: 0.9032 - val_auc: 0.9753 - val_prc: 0.7259
Epoch 22/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.1820 - tp: 18659.0000 - fp: 1231.0000 - tn: 19399.0000 - fn: 1671.0000 - accuracy: 0.9292 - precision: 0.9381 - recall: 0.9178 - auc: 0.9813 - prc: 0.9837 - val_loss: 0.1322 - val_tp: 84.0000 - val_fp: 959.0000 - val_tn: 44517.0000 - val_fn: 9.0000 - val_accuracy: 0.9788 - val_precision: 0.0805 - val_recall: 0.9032 - val_auc: 0.9753 - val_prc: 0.7304
Epoch 23/1000
20/20 [==============================] - 1s 29ms/step - loss: 0.1781 - tp: 18684.0000 - fp: 1080.0000 - tn: 19526.0000 - fn: 1670.0000 - accuracy: 0.9329 - precision: 0.9454 - recall: 0.9180 - auc: 0.9819 - prc: 0.9841 - val_loss: 0.1258 - val_tp: 84.0000 - val_fp: 931.0000 - val_tn: 44545.0000 - val_fn: 9.0000 - val_accuracy: 0.9794 - val_precision: 0.0828 - val_recall: 0.9032 - val_auc: 0.9752 - val_prc: 0.7312
Epoch 24/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.1713 - tp: 18700.0000 - fp: 954.0000 - tn: 19691.0000 - fn: 1615.0000 - accuracy: 0.9373 - precision: 0.9515 - recall: 0.9205 - auc: 0.9834 - prc: 0.9854 - val_loss: 0.1209 - val_tp: 84.0000 - val_fp: 905.0000 - val_tn: 44571.0000 - val_fn: 9.0000 - val_accuracy: 0.9799 - val_precision: 0.0849 - val_recall: 0.9032 - val_auc: 0.9754 - val_prc: 0.7315
Epoch 25/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.1660 - tp: 18755.0000 - fp: 954.0000 - tn: 19675.0000 - fn: 1576.0000 - accuracy: 0.9382 - precision: 0.9516 - recall: 0.9225 - auc: 0.9846 - prc: 0.9864 - val_loss: 0.1169 - val_tp: 84.0000 - val_fp: 904.0000 - val_tn: 44572.0000 - val_fn: 9.0000 - val_accuracy: 0.9800 - val_precision: 0.0850 - val_recall: 0.9032 - val_auc: 0.9749 - val_prc: 0.7316
Epoch 26/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.1653 - tp: 18864.0000 - fp: 993.0000 - tn: 19548.0000 - fn: 1555.0000 - accuracy: 0.9378 - precision: 0.9500 - recall: 0.9238 - auc: 0.9847 - prc: 0.9864 - val_loss: 0.1129 - val_tp: 84.0000 - val_fp: 892.0000 - val_tn: 44584.0000 - val_fn: 9.0000 - val_accuracy: 0.9802 - val_precision: 0.0861 - val_recall: 0.9032 - val_auc: 0.9751 - val_prc: 0.7337
Epoch 27/1000
20/20 [==============================] - 1s 29ms/step - loss: 0.1606 - tp: 19038.0000 - fp: 915.0000 - tn: 19433.0000 - fn: 1574.0000 - accuracy: 0.9392 - precision: 0.9541 - recall: 0.9236 - auc: 0.9855 - prc: 0.9873 - val_loss: 0.1096 - val_tp: 84.0000 - val_fp: 879.0000 - val_tn: 44597.0000 - val_fn: 9.0000 - val_accuracy: 0.9805 - val_precision: 0.0872 - val_recall: 0.9032 - val_auc: 0.9759 - val_prc: 0.7339
Epoch 28/1000
20/20 [==============================] - 1s 30ms/step - loss: 0.1579 - tp: 18794.0000 - fp: 885.0000 - tn: 19743.0000 - fn: 1538.0000 - accuracy: 0.9408 - precision: 0.9550 - recall: 0.9244 - auc: 0.9862 - prc: 0.9876 - val_loss: 0.1064 - val_tp: 84.0000 - val_fp: 876.0000 - val_tn: 44600.0000 - val_fn: 9.0000 - val_accuracy: 0.9806 - val_precision: 0.0875 - val_recall: 0.9032 - val_auc: 0.9760 - val_prc: 0.7341
Epoch 29/1000
20/20 [==============================] - 1s 32ms/step - loss: 0.1528 - tp: 18763.0000 - fp: 844.0000 - tn: 19832.0000 - fn: 1521.0000 - accuracy: 0.9423 - precision: 0.9570 - recall: 0.9250 - auc: 0.9872 - prc: 0.9884 - val_loss: 0.1035 - val_tp: 84.0000 - val_fp: 882.0000 - val_tn: 44594.0000 - val_fn: 9.0000 - val_accuracy: 0.9804 - val_precision: 0.0870 - val_recall: 0.9032 - val_auc: 0.9754 - val_prc: 0.7351
Epoch 30/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.1525 - tp: 18954.0000 - fp: 892.0000 - tn: 19623.0000 - fn: 1491.0000 - accuracy: 0.9418 - precision: 0.9551 - recall: 0.9271 - auc: 0.9873 - prc: 0.9885 - val_loss: 0.1005 - val_tp: 84.0000 - val_fp: 874.0000 - val_tn: 44602.0000 - val_fn: 9.0000 - val_accuracy: 0.9806 - val_precision: 0.0877 - val_recall: 0.9032 - val_auc: 0.9761 - val_prc: 0.7367
Epoch 31/1000
20/20 [==============================] - 1s 32ms/step - loss: 0.1508 - tp: 18944.0000 - fp: 900.0000 - tn: 19604.0000 - fn: 1512.0000 - accuracy: 0.9411 - precision: 0.9546 - recall: 0.9261 - auc: 0.9873 - prc: 0.9885 - val_loss: 0.0973 - val_tp: 84.0000 - val_fp: 856.0000 - val_tn: 44620.0000 - val_fn: 9.0000 - val_accuracy: 0.9810 - val_precision: 0.0894 - val_recall: 0.9032 - val_auc: 0.9761 - val_prc: 0.7286
Epoch 32/1000
20/20 [==============================] - 1s 33ms/step - loss: 0.1484 - tp: 18926.0000 - fp: 827.0000 - tn: 19758.0000 - fn: 1449.0000 - accuracy: 0.9444 - precision: 0.9581 - recall: 0.9289 - auc: 0.9878 - prc: 0.9887 - val_loss: 0.0953 - val_tp: 84.0000 - val_fp: 856.0000 - val_tn: 44620.0000 - val_fn: 9.0000 - val_accuracy: 0.9810 - val_precision: 0.0894 - val_recall: 0.9032 - val_auc: 0.9766 - val_prc: 0.7289
Epoch 33/1000
20/20 [==============================] - 1s 32ms/step - loss: 0.1443 - tp: 19024.0000 - fp: 803.0000 - tn: 19677.0000 - fn: 1456.0000 - accuracy: 0.9448 - precision: 0.9595 - recall: 0.9289 - auc: 0.9885 - prc: 0.9895 - val_loss: 0.0931 - val_tp: 84.0000 - val_fp: 856.0000 - val_tn: 44620.0000 - val_fn: 9.0000 - val_accuracy: 0.9810 - val_precision: 0.0894 - val_recall: 0.9032 - val_auc: 0.9770 - val_prc: 0.7290
Epoch 34/1000
20/20 [==============================] - 1s 32ms/step - loss: 0.1457 - tp: 19130.0000 - fp: 865.0000 - tn: 19555.0000 - fn: 1410.0000 - accuracy: 0.9445 - precision: 0.9567 - recall: 0.9314 - auc: 0.9886 - prc: 0.9895 - val_loss: 0.0910 - val_tp: 84.0000 - val_fp: 857.0000 - val_tn: 44619.0000 - val_fn: 9.0000 - val_accuracy: 0.9810 - val_precision: 0.0893 - val_recall: 0.9032 - val_auc: 0.9770 - val_prc: 0.7291
Epoch 35/1000
20/20 [==============================] - 1s 32ms/step - loss: 0.1418 - tp: 19237.0000 - fp: 810.0000 - tn: 19513.0000 - fn: 1400.0000 - accuracy: 0.9460 - precision: 0.9596 - recall: 0.9322 - auc: 0.9890 - prc: 0.9898 - val_loss: 0.0893 - val_tp: 84.0000 - val_fp: 842.0000 - val_tn: 44634.0000 - val_fn: 9.0000 - val_accuracy: 0.9813 - val_precision: 0.0907 - val_recall: 0.9032 - val_auc: 0.9759 - val_prc: 0.7293
Epoch 36/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.1400 - tp: 19232.0000 - fp: 789.0000 - tn: 19576.0000 - fn: 1363.0000 - accuracy: 0.9475 - precision: 0.9606 - recall: 0.9338 - auc: 0.9894 - prc: 0.9902 - val_loss: 0.0874 - val_tp: 84.0000 - val_fp: 845.0000 - val_tn: 44631.0000 - val_fn: 9.0000 - val_accuracy: 0.9813 - val_precision: 0.0904 - val_recall: 0.9032 - val_auc: 0.9763 - val_prc: 0.7293
Epoch 37/1000
20/20 [==============================] - 1s 32ms/step - loss: 0.1385 - tp: 19219.0000 - fp: 784.0000 - tn: 19574.0000 - fn: 1383.0000 - accuracy: 0.9471 - precision: 0.9608 - recall: 0.9329 - auc: 0.9897 - prc: 0.9904 - val_loss: 0.0857 - val_tp: 84.0000 - val_fp: 846.0000 - val_tn: 44630.0000 - val_fn: 9.0000 - val_accuracy: 0.9812 - val_precision: 0.0903 - val_recall: 0.9032 - val_auc: 0.9758 - val_prc: 0.7218
Epoch 38/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.1329 - tp: 19183.0000 - fp: 731.0000 - tn: 19701.0000 - fn: 1345.0000 - accuracy: 0.9493 - precision: 0.9633 - recall: 0.9345 - auc: 0.9907 - prc: 0.9913 - val_loss: 0.0840 - val_tp: 84.0000 - val_fp: 838.0000 - val_tn: 44638.0000 - val_fn: 9.0000 - val_accuracy: 0.9814 - val_precision: 0.0911 - val_recall: 0.9032 - val_auc: 0.9762 - val_prc: 0.7220
Epoch 39/1000
20/20 [==============================] - 1s 32ms/step - loss: 0.1314 - tp: 19074.0000 - fp: 765.0000 - tn: 19820.0000 - fn: 1301.0000 - accuracy: 0.9496 - precision: 0.9614 - recall: 0.9361 - auc: 0.9909 - prc: 0.9913 - val_loss: 0.0823 - val_tp: 84.0000 - val_fp: 839.0000 - val_tn: 44637.0000 - val_fn: 9.0000 - val_accuracy: 0.9814 - val_precision: 0.0910 - val_recall: 0.9032 - val_auc: 0.9765 - val_prc: 0.7226
Epoch 40/1000
20/20 [==============================] - 1s 31ms/step - loss: 0.1298 - tp: 19257.0000 - fp: 728.0000 - tn: 19690.0000 - fn: 1285.0000 - accuracy: 0.9509 - precision: 0.9636 - recall: 0.9374 - auc: 0.9911 - prc: 0.9917 - val_loss: 0.0809 - val_tp: 84.0000 - val_fp: 833.0000 - val_tn: 44643.0000 - val_fn: 9.0000 - val_accuracy: 0.9815 - val_precision: 0.0916 - val_recall: 0.9032 - val_auc: 0.9769 - val_prc: 0.7224
Restoring model weights from the end of the best epoch.
Epoch 00040: early stopping

Re-check training history

plot_metrics(resampled_history)

[Figure: training and validation metric curves for the re-trained resampled model]

Evaluate metrics

train_predictions_resampled = resampled_model.predict(train_features, batch_size=BATCH_SIZE)
test_predictions_resampled = resampled_model.predict(test_features, batch_size=BATCH_SIZE)
resampled_results = resampled_model.evaluate(test_features, test_labels,
                                             batch_size=BATCH_SIZE, verbose=0)
for name, value in zip(resampled_model.metrics_names, resampled_results):
  print(name, ': ', value)
print()

plot_cm(test_labels, test_predictions_resampled)
loss :  0.10050631314516068
tp :  94.0
fp :  1165.0
tn :  55690.0
fn :  13.0
accuracy :  0.9793195724487305
precision :  0.07466243207454681
recall :  0.8785046935081482
auc :  0.9575912952423096
prc :  0.7548773884773254

Legitimate Transactions Detected (True Negatives):  55690
Legitimate Transactions Incorrectly Detected (False Positives):  1165
Fraudulent Transactions Missed (False Negatives):  13
Fraudulent Transactions Detected (True Positives):  94
Total Fraudulent Transactions:  107

[Figure: confusion matrix for the resampled model at threshold 0.50]

Plot the ROC

plot_roc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_roc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')

plot_roc("Train Weighted", train_labels, train_predictions_weighted, color=colors[1])
plot_roc("Test Weighted", test_labels, test_predictions_weighted, color=colors[1], linestyle='--')

plot_roc("Train Resampled", train_labels, train_predictions_resampled, color=colors[2])
plot_roc("Test Resampled", test_labels, test_predictions_resampled, color=colors[2], linestyle='--')
plt.legend(loc='lower right')
<matplotlib.legend.Legend at 0x7f7134635bd0>

[Figure: ROC curves comparing the baseline, class-weighted, and resampled models]

Plot the AUPRC

plot_prc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_prc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')

plot_prc("Train Weighted", train_labels, train_predictions_weighted, color=colors[1])
plot_prc("Test Weighted", test_labels, test_predictions_weighted, color=colors[1], linestyle='--')

plot_prc("Train Resampled", train_labels, train_predictions_resampled, color=colors[2])
plot_prc("Test Resampled", test_labels, test_predictions_resampled, color=colors[2], linestyle='--')
plt.legend(loc='lower right')
<matplotlib.legend.Legend at 0x7f75f6c1af90>

[Figure: precision-recall curves comparing the baseline, class-weighted, and resampled models]

Applying this tutorial to your problem

Imbalanced data classification is an inherently difficult task, since there are so few samples to learn from. You should always start with the data first, doing your best to collect as many samples as possible, and give substantial thought to which features may be relevant so the model can get the most out of your minority class. At some point your model may struggle to improve and yield the results you want, so it is important to keep in mind the context of your problem and the trade-offs between different types of errors.