
Classification on imbalanced data


This tutorial demonstrates how to classify a highly imbalanced dataset in which the number of examples in one class vastly outnumbers the examples in another. You will work with the Credit Card Fraud Detection dataset hosted on Kaggle. The aim is to detect a mere 492 fraudulent transactions from 284,807 transactions in total. You will use Keras to define the model and class weights to help the model learn from the imbalanced data.

This tutorial contains complete code to:

  • Load a CSV file using Pandas.
  • Create train, validation, and test sets.
  • Define and train a model using Keras (including setting class weights).
  • Evaluate the model using various metrics (including precision and recall).
  • Try common techniques for dealing with imbalanced data like:
    • Class weighting
    • Oversampling

Setup

import tensorflow as tf
from tensorflow import keras

import os
import tempfile

import matplotlib as mpl
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import seaborn as sns

import sklearn
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
mpl.rcParams['figure.figsize'] = (12, 10)
colors = plt.rcParams['axes.prop_cycle'].by_key()['color']

Data processing and exploration

Download the Kaggle Credit Card Fraud data set

Pandas is a Python library with many helpful utilities for loading and working with structured data. It can be used to download CSVs into a Pandas DataFrame.

raw_df = pd.read_csv('https://storage.googleapis.com/download.tensorflow.org/data/creditcard.csv')
raw_df.head()
raw_df[['Time', 'V1', 'V2', 'V3', 'V4', 'V5', 'V26', 'V27', 'V28', 'Amount', 'Class']].describe()

Examine the class label imbalance

Let's look at the dataset imbalance:

neg, pos = np.bincount(raw_df['Class'])
total = neg + pos
print('Examples:\n    Total: {}\n    Positive: {} ({:.2f}% of total)\n'.format(
    total, pos, 100 * pos / total))
Examples:
    Total: 284807
    Positive: 492 (0.17% of total)

This shows the small fraction of positive samples.

Clean, split and normalize the data

The raw data has a few issues. First, the Time and Amount columns are too variable to use directly. Drop the Time column (since it's not clear what it means) and take the log of the Amount column to reduce its range.

cleaned_df = raw_df.copy()

# You don't want the `Time` column.
cleaned_df.pop('Time')

# The `Amount` column covers a huge range. Convert to log-space.
eps = 0.001 # 0 => 0.1¢
cleaned_df['Log Amount'] = np.log(cleaned_df.pop('Amount')+eps)

Split the dataset into train, validation, and test sets. The validation set is used during model fitting to evaluate the loss and any metrics, however the model is not fit on this data. The test set is completely unused during the training phase and is only used at the end to evaluate how well the model generalizes to new data. This is especially important with imbalanced datasets where overfitting is a significant concern from the lack of training data.

# Use a utility from sklearn to split and shuffle your dataset.
train_df, test_df = train_test_split(cleaned_df, test_size=0.2)
train_df, val_df = train_test_split(train_df, test_size=0.2)

# Form np arrays of labels and features.
train_labels = np.array(train_df.pop('Class'))
bool_train_labels = train_labels != 0
val_labels = np.array(val_df.pop('Class'))
test_labels = np.array(test_df.pop('Class'))

train_features = np.array(train_df)
val_features = np.array(val_df)
test_features = np.array(test_df)
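
Note that train_test_split shuffles randomly but does not stratify, so the positive rate can drift slightly between the splits. If you want each split to preserve the same ~0.17% positive rate, the function accepts a stratify argument; here is a minimal sketch of that optional variation (not used in the rest of this tutorial):

# Optional: stratified splits keep the class proportions identical
# across train/validation/test, which reduces variance in the tiny
# positive counts. Shown only for illustration.
strat_train_df, strat_test_df = train_test_split(
    cleaned_df, test_size=0.2, stratify=cleaned_df['Class'])
strat_train_df, strat_val_df = train_test_split(
    strat_train_df, test_size=0.2, stratify=strat_train_df['Class'])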

Normalize the input features using the sklearn StandardScaler, which sets the mean to 0 and the standard deviation to 1. Note that the scaler is fit only on the training features, so no information leaks from the validation or test sets.

scaler = StandardScaler()
train_features = scaler.fit_transform(train_features)

val_features = scaler.transform(val_features)
test_features = scaler.transform(test_features)

train_features = np.clip(train_features, -5, 5)
val_features = np.clip(val_features, -5, 5)
test_features = np.clip(test_features, -5, 5)
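
The clipping above caps extreme outliers at +/- 5 standard deviations. As a quick check (an addition, not part of the original notebook), you can measure what fraction of feature values the clip actually touches:

# Values equal to +/- 5 after clipping were at or beyond the limits.
print('fraction clipped: {:.4f}'.format((np.abs(train_features) >= 5).mean()))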


print('Training labels shape:', train_labels.shape)
print('Validation labels shape:', val_labels.shape)
print('Test labels shape:', test_labels.shape)

print('Training features shape:', train_features.shape)
print('Validation features shape:', val_features.shape)
print('Test features shape:', test_features.shape)
Training labels shape: (182276,)
Validation labels shape: (45569,)
Test labels shape: (56962,)
Training features shape: (182276, 29)
Validation features shape: (45569, 29)
Test features shape: (56962, 29)

Look at the data distribution

Next, compare the distributions of the positive and negative examples over a few features. Good questions to ask yourself at this point are:

  • Do these distributions make sense?
    • Yes. You've normalized the input and these are mostly concentrated in the +/- 2 range.
  • Can you see the difference between the distributions?
    • Yes, the positive examples contain a much higher rate of extreme values.
pos_df = pd.DataFrame(train_features[ bool_train_labels], columns=train_df.columns)
neg_df = pd.DataFrame(train_features[~bool_train_labels], columns=train_df.columns)

sns.jointplot(x=pos_df['V5'], y=pos_df['V6'],
              kind='hex', xlim=(-5,5), ylim=(-5,5))
plt.suptitle("Positive distribution")

sns.jointplot(x=neg_df['V5'], y=neg_df['V6'],
              kind='hex', xlim=(-5,5), ylim=(-5,5))
_ = plt.suptitle("Negative distribution")


Define the model and metrics

Define a function that creates a simple neural network with a densely connected hidden layer, a dropout layer to reduce overfitting, and an output sigmoid layer that returns the probability of a transaction being fraudulent:

METRICS = [
      keras.metrics.TruePositives(name='tp'),
      keras.metrics.FalsePositives(name='fp'),
      keras.metrics.TrueNegatives(name='tn'),
      keras.metrics.FalseNegatives(name='fn'), 
      keras.metrics.BinaryAccuracy(name='accuracy'),
      keras.metrics.Precision(name='precision'),
      keras.metrics.Recall(name='recall'),
      keras.metrics.AUC(name='auc'),
      keras.metrics.AUC(name='prc', curve='PR'), # precision-recall curve
]

def make_model(metrics=METRICS, output_bias=None):
  if output_bias is not None:
    output_bias = tf.keras.initializers.Constant(output_bias)
  model = keras.Sequential([
      keras.layers.Dense(
          16, activation='relu',
          input_shape=(train_features.shape[-1],)),
      keras.layers.Dropout(0.5),
      keras.layers.Dense(1, activation='sigmoid',
                         bias_initializer=output_bias),
  ])

  model.compile(
      optimizer=keras.optimizers.Adam(learning_rate=1e-3),
      loss=keras.losses.BinaryCrossentropy(),
      metrics=metrics)

  return model

Understanding useful metrics

Notice that there are a few metrics defined above that can be computed by the model and that will be helpful when evaluating the performance.

  • False negatives and false positives are samples that were incorrectly classified
  • True negatives and true positives are samples that were correctly classified
  • Accuracy is the percentage of examples correctly classified > \(\frac{\text{true samples}}{\text{total samples}}\)
  • Precision is the percentage of predicted positives that were correctly classified > \(\frac{\text{true positives}}{\text{true positives + false positives}}\)
  • Recall is the percentage of actual positives that were correctly classified > \(\frac{\text{true positives}}{\text{true positives + false negatives}}\)
  • AUC refers to the Area Under the Curve of a Receiver Operating Characteristic curve (ROC-AUC). This metric is equal to the probability that a classifier will rank a random positive sample higher than a random negative sample.
  • AUPRC refers to the Area Under the Curve of the Precision-Recall Curve. This metric computes precision-recall pairs for different probability thresholds.
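
To make these definitions concrete, here is a minimal sketch that recomputes them from raw confusion-matrix counts (the counts used are the ones the baseline model produces on the test set later in this tutorial):

tp, fp, tn, fn = 82, 5, 56853, 22  # baseline test-set counts from below

accuracy = (tp + tn) / (tp + fp + tn + fn)
precision = tp / (tp + fp)
recall = tp / (tp + fn)

print('accuracy: {:.4f}'.format(accuracy))    # 0.9995
print('precision: {:.4f}'.format(precision))  # 0.9425
print('recall: {:.4f}'.format(recall))        # 0.7885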


Baseline model

Build the model

Now create and train your model using the function that was defined earlier. Notice that the model is fit using a larger-than-default batch size of 2048; this is important to ensure that each batch has a decent chance of containing a few positive samples. If the batch size were too small, the batches would likely have no fraudulent transactions to learn from.
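
As a rough sanity check (simple arithmetic, not part of the original notebook), estimate the expected number of fraudulent examples per batch:

# Expected positives per batch = batch_size * pos / total.
print('batch of 2048: {:.2f}'.format(2048 * pos / total))  # ~3.5
print('batch of 32: {:.2f}'.format(32 * pos / total))      # ~0.06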

EPOCHS = 100
BATCH_SIZE = 2048

early_stopping = tf.keras.callbacks.EarlyStopping(
    monitor='val_prc', 
    verbose=1,
    patience=10,
    mode='max',
    restore_best_weights=True)
model = make_model()
model.summary()
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense (Dense)               (None, 16)                480       
                                                                 
 dropout (Dropout)           (None, 16)                0         
                                                                 
 dense_1 (Dense)             (None, 1)                 17        
                                                                 
=================================================================
Total params: 497
Trainable params: 497
Non-trainable params: 0
_________________________________________________________________

Test run the model:

model.predict(train_features[:10])
array([[0.3673761 ],
       [0.35600176],
       [0.60139984],
       [0.42065838],
       [0.37041035],
       [0.27554348],
       [0.39560333],
       [0.5962162 ],
       [0.6676472 ],
       [0.47435313]], dtype=float32)

Optional: Set the correct initial bias.

These initial guesses are not great. You know the dataset is imbalanced, so set the output layer's bias to reflect that (see: A Recipe for Training Neural Networks: "init well"). This can help with initial convergence.

With the default bias initialization the loss should be about math.log(2) = 0.69314.

results = model.evaluate(train_features, train_labels, batch_size=BATCH_SIZE, verbose=0)
print("Loss: {:0.4f}".format(results[0]))
Loss: 0.6497

The correct bias to set can be derived from:

\[ p_0 = pos/(pos + neg) = 1/(1+e^{-b_0}) \]

\[ b_0 = -log_e(1/p_0 - 1) \]

\[ b_0 = log_e(pos/neg)\]

initial_bias = np.log([pos/neg])
initial_bias
array([-6.35935934])
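
As a quick sanity check (an addition, not in the original notebook), applying the sigmoid to this bias should recover the positive-class prior:

# sigmoid(b0) = 1 / (1 + exp(-b0)) should equal p0 = pos / total.
print(pos / (pos + neg))                # ~0.00173
print(1 / (1 + np.exp(-initial_bias)))  # ~[0.00173]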

Set that as the initial bias, and the model will give much more reasonable initial guesses.

It should be near: pos/total = 0.0018

model = make_model(output_bias=initial_bias)
model.predict(train_features[:10])
array([[0.00085616],
       [0.0043528 ],
       [0.00127403],
       [0.00196918],
       [0.00238743],
       [0.01523864],
       [0.00139776],
       [0.01105964],
       [0.00072914],
       [0.00378807]], dtype=float32)

With this initialization the initial loss should be approximately:

\[-p_0log(p_0)-(1-p_0)log(1-p_0) = 0.01317\]
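
You can evaluate this expression directly (a small check, not in the original notebook). Note that 0.01317 corresponds to the rounded prior p_0 = 0.0018; the exact prior gives roughly 0.0127:

# Cross-entropy of a constant-p0 predictor: the best possible loss
# for a model that ignores the inputs.
p0 = pos / total
print(-p0 * np.log(p0) - (1 - p0) * np.log(1 - p0))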

results = model.evaluate(train_features, train_labels, batch_size=BATCH_SIZE, verbose=0)
print("Loss: {:0.4f}".format(results[0]))
Loss: 0.0164

This initial loss is about 50 times smaller than it would be with naive initialization.

This way the model doesn't need to spend the first few epochs just learning that positive examples are unlikely. This also makes it easier to read plots of the loss during training.

Checkpoint the initial weights

To make the various training runs more comparable, keep this initial model's weights in a checkpoint file, and load them into each model before training:

initial_weights = os.path.join(tempfile.mkdtemp(), 'initial_weights')
model.save_weights(initial_weights)

Confirm that the bias fix helps

Before moving on, confirm quickly that the careful bias initialization actually helped.

Train the model for 20 epochs, with and without this careful initialization, and compare the losses:

model = make_model()
model.load_weights(initial_weights)
model.layers[-1].bias.assign([0.0])
zero_bias_history = model.fit(
    train_features,
    train_labels,
    batch_size=BATCH_SIZE,
    epochs=20,
    validation_data=(val_features, val_labels), 
    verbose=0)
model = make_model()
model.load_weights(initial_weights)
careful_bias_history = model.fit(
    train_features,
    train_labels,
    batch_size=BATCH_SIZE,
    epochs=20,
    validation_data=(val_features, val_labels), 
    verbose=0)
def plot_loss(history, label, n):
  # Use a log scale on y-axis to show the wide range of values.
  plt.semilogy(history.epoch, history.history['loss'],
               color=colors[n], label='Train ' + label)
  plt.semilogy(history.epoch, history.history['val_loss'],
               color=colors[n], label='Val ' + label,
               linestyle="--")
  plt.xlabel('Epoch')
  plt.ylabel('Loss')
plot_loss(zero_bias_history, "Zero Bias", 0)
plot_loss(careful_bias_history, "Careful Bias", 1)


The above figure makes it clear: in terms of validation loss, on this problem, this careful initialization gives a clear advantage.

Train the model

model = make_model()
model.load_weights(initial_weights)
baseline_history = model.fit(
    train_features,
    train_labels,
    batch_size=BATCH_SIZE,
    epochs=EPOCHS,
    callbacks=[early_stopping],
    validation_data=(val_features, val_labels))
Epoch 1/100
90/90 [==============================] - 3s 17ms/step - loss: 0.0131 - tp: 90.0000 - fp: 31.0000 - tn: 227426.0000 - fn: 298.0000 - accuracy: 0.9986 - precision: 0.7438 - recall: 0.2320 - auc: 0.7283 - prc: 0.2674 - val_loss: 0.0088 - val_tp: 0.0000e+00 - val_fp: 0.0000e+00 - val_tn: 45481.0000 - val_fn: 88.0000 - val_accuracy: 0.9981 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 - val_auc: 0.8910 - val_prc: 0.6002
Epoch 2/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0086 - tp: 73.0000 - fp: 23.0000 - tn: 181953.0000 - fn: 227.0000 - accuracy: 0.9986 - precision: 0.7604 - recall: 0.2433 - auc: 0.7955 - prc: 0.3433 - val_loss: 0.0058 - val_tp: 31.0000 - val_fp: 6.0000 - val_tn: 45475.0000 - val_fn: 57.0000 - val_accuracy: 0.9986 - val_precision: 0.8378 - val_recall: 0.3523 - val_auc: 0.9145 - val_prc: 0.6818
Epoch 3/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0071 - tp: 115.0000 - fp: 22.0000 - tn: 181954.0000 - fn: 185.0000 - accuracy: 0.9989 - precision: 0.8394 - recall: 0.3833 - auc: 0.8513 - prc: 0.4569 - val_loss: 0.0048 - val_tp: 43.0000 - val_fp: 7.0000 - val_tn: 45474.0000 - val_fn: 45.0000 - val_accuracy: 0.9989 - val_precision: 0.8600 - val_recall: 0.4886 - val_auc: 0.9315 - val_prc: 0.7302
Epoch 4/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0063 - tp: 121.0000 - fp: 24.0000 - tn: 181952.0000 - fn: 179.0000 - accuracy: 0.9989 - precision: 0.8345 - recall: 0.4033 - auc: 0.8797 - prc: 0.5420 - val_loss: 0.0044 - val_tp: 53.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 35.0000 - val_accuracy: 0.9990 - val_precision: 0.8413 - val_recall: 0.6023 - val_auc: 0.9315 - val_prc: 0.7347
Epoch 5/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0057 - tp: 157.0000 - fp: 27.0000 - tn: 181949.0000 - fn: 143.0000 - accuracy: 0.9991 - precision: 0.8533 - recall: 0.5233 - auc: 0.8925 - prc: 0.5852 - val_loss: 0.0042 - val_tp: 55.0000 - val_fp: 11.0000 - val_tn: 45470.0000 - val_fn: 33.0000 - val_accuracy: 0.9990 - val_precision: 0.8333 - val_recall: 0.6250 - val_auc: 0.9372 - val_prc: 0.7623
Epoch 6/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0050 - tp: 158.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 142.0000 - accuracy: 0.9991 - precision: 0.8634 - recall: 0.5267 - auc: 0.9166 - prc: 0.6520 - val_loss: 0.0040 - val_tp: 56.0000 - val_fp: 11.0000 - val_tn: 45470.0000 - val_fn: 32.0000 - val_accuracy: 0.9991 - val_precision: 0.8358 - val_recall: 0.6364 - val_auc: 0.9429 - val_prc: 0.7732
Epoch 7/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0054 - tp: 157.0000 - fp: 31.0000 - tn: 181945.0000 - fn: 143.0000 - accuracy: 0.9990 - precision: 0.8351 - recall: 0.5233 - auc: 0.9066 - prc: 0.5991 - val_loss: 0.0039 - val_tp: 54.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 34.0000 - val_accuracy: 0.9990 - val_precision: 0.8438 - val_recall: 0.6136 - val_auc: 0.9429 - val_prc: 0.7902
Epoch 8/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0048 - tp: 162.0000 - fp: 36.0000 - tn: 181940.0000 - fn: 138.0000 - accuracy: 0.9990 - precision: 0.8182 - recall: 0.5400 - auc: 0.9036 - prc: 0.6601 - val_loss: 0.0037 - val_tp: 55.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 33.0000 - val_accuracy: 0.9991 - val_precision: 0.8462 - val_recall: 0.6250 - val_auc: 0.9429 - val_prc: 0.8039
Epoch 9/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0045 - tp: 169.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 131.0000 - accuracy: 0.9991 - precision: 0.8535 - recall: 0.5633 - auc: 0.9137 - prc: 0.6753 - val_loss: 0.0035 - val_tp: 64.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 24.0000 - val_accuracy: 0.9993 - val_precision: 0.8649 - val_recall: 0.7273 - val_auc: 0.9486 - val_prc: 0.8152
Epoch 10/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0047 - tp: 167.0000 - fp: 33.0000 - tn: 181943.0000 - fn: 133.0000 - accuracy: 0.9991 - precision: 0.8350 - recall: 0.5567 - auc: 0.9122 - prc: 0.6616 - val_loss: 0.0034 - val_tp: 66.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 22.0000 - val_accuracy: 0.9993 - val_precision: 0.8684 - val_recall: 0.7500 - val_auc: 0.9485 - val_prc: 0.8199
Epoch 11/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0045 - tp: 177.0000 - fp: 31.0000 - tn: 181945.0000 - fn: 123.0000 - accuracy: 0.9992 - precision: 0.8510 - recall: 0.5900 - auc: 0.9206 - prc: 0.6849 - val_loss: 0.0033 - val_tp: 66.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 22.0000 - val_accuracy: 0.9993 - val_precision: 0.8684 - val_recall: 0.7500 - val_auc: 0.9486 - val_prc: 0.8273
Epoch 12/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0041 - tp: 181.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 119.0000 - accuracy: 0.9992 - precision: 0.8786 - recall: 0.6033 - auc: 0.9190 - prc: 0.7070 - val_loss: 0.0032 - val_tp: 67.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 21.0000 - val_accuracy: 0.9993 - val_precision: 0.8701 - val_recall: 0.7614 - val_auc: 0.9486 - val_prc: 0.8283
Epoch 13/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0044 - tp: 170.0000 - fp: 32.0000 - tn: 181944.0000 - fn: 130.0000 - accuracy: 0.9991 - precision: 0.8416 - recall: 0.5667 - auc: 0.9189 - prc: 0.6625 - val_loss: 0.0032 - val_tp: 67.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 21.0000 - val_accuracy: 0.9993 - val_precision: 0.8701 - val_recall: 0.7614 - val_auc: 0.9485 - val_prc: 0.8298
Epoch 14/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0041 - tp: 177.0000 - fp: 33.0000 - tn: 181943.0000 - fn: 123.0000 - accuracy: 0.9991 - precision: 0.8429 - recall: 0.5900 - auc: 0.9323 - prc: 0.6980 - val_loss: 0.0031 - val_tp: 68.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 20.0000 - val_accuracy: 0.9993 - val_precision: 0.8718 - val_recall: 0.7727 - val_auc: 0.9486 - val_prc: 0.8352
Epoch 15/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0040 - tp: 180.0000 - fp: 26.0000 - tn: 181950.0000 - fn: 120.0000 - accuracy: 0.9992 - precision: 0.8738 - recall: 0.6000 - auc: 0.9291 - prc: 0.7266 - val_loss: 0.0031 - val_tp: 71.0000 - val_fp: 11.0000 - val_tn: 45470.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8659 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8368
Epoch 16/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0041 - tp: 177.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 123.0000 - accuracy: 0.9992 - precision: 0.8592 - recall: 0.5900 - auc: 0.9157 - prc: 0.6905 - val_loss: 0.0031 - val_tp: 68.0000 - val_fp: 11.0000 - val_tn: 45470.0000 - val_fn: 20.0000 - val_accuracy: 0.9993 - val_precision: 0.8608 - val_recall: 0.7727 - val_auc: 0.9486 - val_prc: 0.8380
Epoch 17/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0041 - tp: 182.0000 - fp: 30.0000 - tn: 181946.0000 - fn: 118.0000 - accuracy: 0.9992 - precision: 0.8585 - recall: 0.6067 - auc: 0.9224 - prc: 0.6989 - val_loss: 0.0030 - val_tp: 69.0000 - val_fp: 11.0000 - val_tn: 45470.0000 - val_fn: 19.0000 - val_accuracy: 0.9993 - val_precision: 0.8625 - val_recall: 0.7841 - val_auc: 0.9486 - val_prc: 0.8386
Epoch 18/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0039 - tp: 182.0000 - fp: 21.0000 - tn: 181955.0000 - fn: 118.0000 - accuracy: 0.9992 - precision: 0.8966 - recall: 0.6067 - auc: 0.9208 - prc: 0.7208 - val_loss: 0.0030 - val_tp: 71.0000 - val_fp: 12.0000 - val_tn: 45469.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8554 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8402
Epoch 19/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0041 - tp: 178.0000 - fp: 32.0000 - tn: 181944.0000 - fn: 122.0000 - accuracy: 0.9992 - precision: 0.8476 - recall: 0.5933 - auc: 0.9307 - prc: 0.6853 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8750 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8441
Epoch 20/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0039 - tp: 176.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 124.0000 - accuracy: 0.9992 - precision: 0.8756 - recall: 0.5867 - auc: 0.9158 - prc: 0.7067 - val_loss: 0.0030 - val_tp: 71.0000 - val_fp: 12.0000 - val_tn: 45469.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8554 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8400
Epoch 21/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0039 - tp: 182.0000 - fp: 30.0000 - tn: 181946.0000 - fn: 118.0000 - accuracy: 0.9992 - precision: 0.8585 - recall: 0.6067 - auc: 0.9275 - prc: 0.7128 - val_loss: 0.0030 - val_tp: 71.0000 - val_fp: 12.0000 - val_tn: 45469.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8554 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8407
Epoch 22/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0042 - tp: 167.0000 - fp: 32.0000 - tn: 181944.0000 - fn: 133.0000 - accuracy: 0.9991 - precision: 0.8392 - recall: 0.5567 - auc: 0.9308 - prc: 0.6904 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8765 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8445
Epoch 23/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0039 - tp: 185.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 115.0000 - accuracy: 0.9992 - precision: 0.8645 - recall: 0.6167 - auc: 0.9225 - prc: 0.7176 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8875 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8463
Epoch 24/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0037 - tp: 186.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 114.0000 - accuracy: 0.9992 - precision: 0.8692 - recall: 0.6200 - auc: 0.9276 - prc: 0.7350 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 17.0000 - val_accuracy: 0.9995 - val_precision: 0.8987 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8458
Epoch 25/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0040 - tp: 178.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 122.0000 - accuracy: 0.9992 - precision: 0.8641 - recall: 0.5933 - auc: 0.9291 - prc: 0.7050 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8875 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8464
Epoch 26/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0036 - tp: 187.0000 - fp: 30.0000 - tn: 181946.0000 - fn: 113.0000 - accuracy: 0.9992 - precision: 0.8618 - recall: 0.6233 - auc: 0.9343 - prc: 0.7460 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 11.0000 - val_tn: 45470.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8659 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8466
Epoch 27/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0038 - tp: 187.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 113.0000 - accuracy: 0.9992 - precision: 0.8821 - recall: 0.6233 - auc: 0.9242 - prc: 0.7149 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 12.0000 - val_tn: 45469.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8554 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8457
Epoch 28/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0036 - tp: 198.0000 - fp: 34.0000 - tn: 181942.0000 - fn: 102.0000 - accuracy: 0.9993 - precision: 0.8534 - recall: 0.6600 - auc: 0.9359 - prc: 0.7386 - val_loss: 0.0029 - val_tp: 69.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 19.0000 - val_accuracy: 0.9994 - val_precision: 0.8961 - val_recall: 0.7841 - val_auc: 0.9486 - val_prc: 0.8488
Epoch 29/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0035 - tp: 179.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 121.0000 - accuracy: 0.9992 - precision: 0.8775 - recall: 0.5967 - auc: 0.9343 - prc: 0.7502 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8765 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8470
Epoch 30/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0037 - tp: 186.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 114.0000 - accuracy: 0.9992 - precision: 0.8651 - recall: 0.6200 - auc: 0.9309 - prc: 0.7295 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8765 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8463
Epoch 31/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0036 - tp: 190.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 110.0000 - accuracy: 0.9993 - precision: 0.8837 - recall: 0.6333 - auc: 0.9209 - prc: 0.7331 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8875 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8468
Epoch 32/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0037 - tp: 183.0000 - fp: 26.0000 - tn: 181950.0000 - fn: 117.0000 - accuracy: 0.9992 - precision: 0.8756 - recall: 0.6100 - auc: 0.9276 - prc: 0.7292 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 11.0000 - val_tn: 45470.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8659 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8460
Epoch 33/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0039 - tp: 176.0000 - fp: 30.0000 - tn: 181946.0000 - fn: 124.0000 - accuracy: 0.9992 - precision: 0.8544 - recall: 0.5867 - auc: 0.9209 - prc: 0.6988 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8477
Epoch 34/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0037 - tp: 182.0000 - fp: 30.0000 - tn: 181946.0000 - fn: 118.0000 - accuracy: 0.9992 - precision: 0.8585 - recall: 0.6067 - auc: 0.9326 - prc: 0.7344 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 11.0000 - val_tn: 45470.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8659 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8466
Epoch 35/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0038 - tp: 187.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 113.0000 - accuracy: 0.9992 - precision: 0.8698 - recall: 0.6233 - auc: 0.9259 - prc: 0.7212 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8875 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8489
Epoch 36/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0036 - tp: 191.0000 - fp: 26.0000 - tn: 181950.0000 - fn: 109.0000 - accuracy: 0.9993 - precision: 0.8802 - recall: 0.6367 - auc: 0.9326 - prc: 0.7360 - val_loss: 0.0028 - val_tp: 69.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 19.0000 - val_accuracy: 0.9994 - val_precision: 0.8846 - val_recall: 0.7841 - val_auc: 0.9486 - val_prc: 0.8517
Epoch 37/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0037 - tp: 177.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 123.0000 - accuracy: 0.9992 - precision: 0.8592 - recall: 0.5900 - auc: 0.9292 - prc: 0.7441 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8875 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8488
Epoch 38/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0035 - tp: 194.0000 - fp: 31.0000 - tn: 181945.0000 - fn: 106.0000 - accuracy: 0.9992 - precision: 0.8622 - recall: 0.6467 - auc: 0.9376 - prc: 0.7547 - val_loss: 0.0029 - val_tp: 69.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 19.0000 - val_accuracy: 0.9994 - val_precision: 0.8846 - val_recall: 0.7841 - val_auc: 0.9486 - val_prc: 0.8524
Epoch 39/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0035 - tp: 191.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 109.0000 - accuracy: 0.9992 - precision: 0.8682 - recall: 0.6367 - auc: 0.9377 - prc: 0.7484 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8875 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8510
Epoch 40/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0037 - tp: 174.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 126.0000 - accuracy: 0.9991 - precision: 0.8571 - recall: 0.5800 - auc: 0.9226 - prc: 0.7251 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8765 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8496
Epoch 41/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 185.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 115.0000 - accuracy: 0.9992 - precision: 0.8810 - recall: 0.6167 - auc: 0.9343 - prc: 0.7545 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8765 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8500
Epoch 42/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0036 - tp: 189.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 111.0000 - accuracy: 0.9992 - precision: 0.8670 - recall: 0.6300 - auc: 0.9325 - prc: 0.7329 - val_loss: 0.0028 - val_tp: 69.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 19.0000 - val_accuracy: 0.9994 - val_precision: 0.8961 - val_recall: 0.7841 - val_auc: 0.9486 - val_prc: 0.8524
Epoch 43/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0038 - tp: 179.0000 - fp: 30.0000 - tn: 181946.0000 - fn: 121.0000 - accuracy: 0.9992 - precision: 0.8565 - recall: 0.5967 - auc: 0.9292 - prc: 0.7203 - val_loss: 0.0028 - val_tp: 69.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 19.0000 - val_accuracy: 0.9994 - val_precision: 0.8961 - val_recall: 0.7841 - val_auc: 0.9486 - val_prc: 0.8540
Epoch 44/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0036 - tp: 185.0000 - fp: 30.0000 - tn: 181946.0000 - fn: 115.0000 - accuracy: 0.9992 - precision: 0.8605 - recall: 0.6167 - auc: 0.9309 - prc: 0.7426 - val_loss: 0.0028 - val_tp: 71.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 17.0000 - val_accuracy: 0.9995 - val_precision: 0.8987 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8535
Epoch 45/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0036 - tp: 183.0000 - fp: 26.0000 - tn: 181950.0000 - fn: 117.0000 - accuracy: 0.9992 - precision: 0.8756 - recall: 0.6100 - auc: 0.9358 - prc: 0.7326 - val_loss: 0.0029 - val_tp: 71.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8875 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8514
Epoch 46/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0035 - tp: 190.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 110.0000 - accuracy: 0.9992 - precision: 0.8676 - recall: 0.6333 - auc: 0.9326 - prc: 0.7380 - val_loss: 0.0029 - val_tp: 69.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 19.0000 - val_accuracy: 0.9994 - val_precision: 0.8846 - val_recall: 0.7841 - val_auc: 0.9486 - val_prc: 0.8535
Epoch 47/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0036 - tp: 181.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 119.0000 - accuracy: 0.9992 - precision: 0.8660 - recall: 0.6033 - auc: 0.9393 - prc: 0.7377 - val_loss: 0.0028 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8539
Epoch 48/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0037 - tp: 189.0000 - fp: 27.0000 - tn: 181949.0000 - fn: 111.0000 - accuracy: 0.9992 - precision: 0.8750 - recall: 0.6300 - auc: 0.9343 - prc: 0.7321 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8508
Epoch 49/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0035 - tp: 185.0000 - fp: 22.0000 - tn: 181954.0000 - fn: 115.0000 - accuracy: 0.9992 - precision: 0.8937 - recall: 0.6167 - auc: 0.9376 - prc: 0.7449 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8533
Epoch 50/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0035 - tp: 188.0000 - fp: 26.0000 - tn: 181950.0000 - fn: 112.0000 - accuracy: 0.9992 - precision: 0.8785 - recall: 0.6267 - auc: 0.9359 - prc: 0.7460 - val_loss: 0.0028 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8551
Epoch 51/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0036 - tp: 191.0000 - fp: 31.0000 - tn: 181945.0000 - fn: 109.0000 - accuracy: 0.9992 - precision: 0.8604 - recall: 0.6367 - auc: 0.9375 - prc: 0.7310 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8538
Epoch 52/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0035 - tp: 186.0000 - fp: 24.0000 - tn: 181952.0000 - fn: 114.0000 - accuracy: 0.9992 - precision: 0.8857 - recall: 0.6200 - auc: 0.9309 - prc: 0.7459 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8549
Epoch 53/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0034 - tp: 185.0000 - fp: 31.0000 - tn: 181945.0000 - fn: 115.0000 - accuracy: 0.9992 - precision: 0.8565 - recall: 0.6167 - auc: 0.9393 - prc: 0.7542 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8551
Epoch 54/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0037 - tp: 176.0000 - fp: 36.0000 - tn: 181940.0000 - fn: 124.0000 - accuracy: 0.9991 - precision: 0.8302 - recall: 0.5867 - auc: 0.9342 - prc: 0.7158 - val_loss: 0.0028 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8567
Epoch 55/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0035 - tp: 198.0000 - fp: 27.0000 - tn: 181949.0000 - fn: 102.0000 - accuracy: 0.9993 - precision: 0.8800 - recall: 0.6600 - auc: 0.9275 - prc: 0.7405 - val_loss: 0.0028 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8563
Epoch 56/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 189.0000 - fp: 30.0000 - tn: 181946.0000 - fn: 111.0000 - accuracy: 0.9992 - precision: 0.8630 - recall: 0.6300 - auc: 0.9376 - prc: 0.7492 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8554
Epoch 57/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 191.0000 - fp: 27.0000 - tn: 181949.0000 - fn: 109.0000 - accuracy: 0.9993 - precision: 0.8761 - recall: 0.6367 - auc: 0.9376 - prc: 0.7500 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8556
Epoch 58/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 199.0000 - fp: 31.0000 - tn: 181945.0000 - fn: 101.0000 - accuracy: 0.9993 - precision: 0.8652 - recall: 0.6633 - auc: 0.9376 - prc: 0.7427 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8545
Epoch 59/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0036 - tp: 179.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 121.0000 - accuracy: 0.9992 - precision: 0.8647 - recall: 0.5967 - auc: 0.9309 - prc: 0.7359 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8572
Epoch 60/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0033 - tp: 192.0000 - fp: 31.0000 - tn: 181945.0000 - fn: 108.0000 - accuracy: 0.9992 - precision: 0.8610 - recall: 0.6400 - auc: 0.9427 - prc: 0.7627 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8574
Epoch 61/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0032 - tp: 196.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 104.0000 - accuracy: 0.9993 - precision: 0.8869 - recall: 0.6533 - auc: 0.9393 - prc: 0.7707 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8579
Epoch 62/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0038 - tp: 185.0000 - fp: 32.0000 - tn: 181944.0000 - fn: 115.0000 - accuracy: 0.9992 - precision: 0.8525 - recall: 0.6167 - auc: 0.9292 - prc: 0.7110 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8575
Epoch 63/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0034 - tp: 190.0000 - fp: 20.0000 - tn: 181956.0000 - fn: 110.0000 - accuracy: 0.9993 - precision: 0.9048 - recall: 0.6333 - auc: 0.9310 - prc: 0.7612 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8565
Epoch 64/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0034 - tp: 195.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 105.0000 - accuracy: 0.9993 - precision: 0.8705 - recall: 0.6500 - auc: 0.9343 - prc: 0.7484 - val_loss: 0.0028 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8579
Epoch 65/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0038 - tp: 179.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 121.0000 - accuracy: 0.9992 - precision: 0.8606 - recall: 0.5967 - auc: 0.9392 - prc: 0.7198 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8577
Epoch 66/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0032 - tp: 194.0000 - fp: 26.0000 - tn: 181950.0000 - fn: 106.0000 - accuracy: 0.9993 - precision: 0.8818 - recall: 0.6467 - auc: 0.9410 - prc: 0.7776 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8550
Epoch 67/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0031 - tp: 203.0000 - fp: 23.0000 - tn: 181953.0000 - fn: 97.0000 - accuracy: 0.9993 - precision: 0.8982 - recall: 0.6767 - auc: 0.9460 - prc: 0.7723 - val_loss: 0.0028 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8593
Epoch 68/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0031 - tp: 197.0000 - fp: 22.0000 - tn: 181954.0000 - fn: 103.0000 - accuracy: 0.9993 - precision: 0.8995 - recall: 0.6567 - auc: 0.9393 - prc: 0.7738 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8574
Epoch 69/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0035 - tp: 190.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 110.0000 - accuracy: 0.9993 - precision: 0.8837 - recall: 0.6333 - auc: 0.9359 - prc: 0.7415 - val_loss: 0.0029 - val_tp: 73.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 15.0000 - val_accuracy: 0.9995 - val_precision: 0.8795 - val_recall: 0.8295 - val_auc: 0.9486 - val_prc: 0.8544
Epoch 70/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 191.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 109.0000 - accuracy: 0.9992 - precision: 0.8682 - recall: 0.6367 - auc: 0.9410 - prc: 0.7520 - val_loss: 0.0028 - val_tp: 71.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 17.0000 - val_accuracy: 0.9994 - val_precision: 0.8875 - val_recall: 0.8068 - val_auc: 0.9486 - val_prc: 0.8570
Epoch 71/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 188.0000 - fp: 23.0000 - tn: 181953.0000 - fn: 112.0000 - accuracy: 0.9993 - precision: 0.8910 - recall: 0.6267 - auc: 0.9343 - prc: 0.7540 - val_loss: 0.0029 - val_tp: 73.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 15.0000 - val_accuracy: 0.9995 - val_precision: 0.8795 - val_recall: 0.8295 - val_auc: 0.9486 - val_prc: 0.8549
Epoch 72/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0036 - tp: 189.0000 - fp: 33.0000 - tn: 181943.0000 - fn: 111.0000 - accuracy: 0.9992 - precision: 0.8514 - recall: 0.6300 - auc: 0.9309 - prc: 0.7316 - val_loss: 0.0029 - val_tp: 73.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 15.0000 - val_accuracy: 0.9995 - val_precision: 0.8902 - val_recall: 0.8295 - val_auc: 0.9486 - val_prc: 0.8570
Epoch 73/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 197.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 103.0000 - accuracy: 0.9993 - precision: 0.8717 - recall: 0.6567 - auc: 0.9376 - prc: 0.7599 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8861 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8573
Epoch 74/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0032 - tp: 194.0000 - fp: 35.0000 - tn: 181941.0000 - fn: 106.0000 - accuracy: 0.9992 - precision: 0.8472 - recall: 0.6467 - auc: 0.9427 - prc: 0.7755 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8574
Epoch 75/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0035 - tp: 182.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 118.0000 - accuracy: 0.9992 - precision: 0.8626 - recall: 0.6067 - auc: 0.9293 - prc: 0.7400 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8573
Epoch 76/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0033 - tp: 194.0000 - fp: 24.0000 - tn: 181952.0000 - fn: 106.0000 - accuracy: 0.9993 - precision: 0.8899 - recall: 0.6467 - auc: 0.9376 - prc: 0.7653 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8570
Epoch 77/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0033 - tp: 204.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 96.0000 - accuracy: 0.9993 - precision: 0.8793 - recall: 0.6800 - auc: 0.9475 - prc: 0.7588 - val_loss: 0.0028 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8594
Epoch 78/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0037 - tp: 172.0000 - fp: 27.0000 - tn: 181949.0000 - fn: 128.0000 - accuracy: 0.9991 - precision: 0.8643 - recall: 0.5733 - auc: 0.9275 - prc: 0.7356 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8573
Epoch 79/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0033 - tp: 191.0000 - fp: 32.0000 - tn: 181944.0000 - fn: 109.0000 - accuracy: 0.9992 - precision: 0.8565 - recall: 0.6367 - auc: 0.9426 - prc: 0.7641 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8564
Epoch 80/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0033 - tp: 184.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 116.0000 - accuracy: 0.9992 - precision: 0.8679 - recall: 0.6133 - auc: 0.9493 - prc: 0.7617 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8565
Epoch 81/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 196.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 104.0000 - accuracy: 0.9993 - precision: 0.8750 - recall: 0.6533 - auc: 0.9392 - prc: 0.7553 - val_loss: 0.0029 - val_tp: 69.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 19.0000 - val_accuracy: 0.9994 - val_precision: 0.8961 - val_recall: 0.7841 - val_auc: 0.9486 - val_prc: 0.8573
Epoch 82/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0033 - tp: 192.0000 - fp: 30.0000 - tn: 181946.0000 - fn: 108.0000 - accuracy: 0.9992 - precision: 0.8649 - recall: 0.6400 - auc: 0.9410 - prc: 0.7590 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8578
Epoch 83/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0032 - tp: 194.0000 - fp: 27.0000 - tn: 181949.0000 - fn: 106.0000 - accuracy: 0.9993 - precision: 0.8778 - recall: 0.6467 - auc: 0.9476 - prc: 0.7682 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 16.0000 - val_accuracy: 0.9994 - val_precision: 0.8780 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8579
Epoch 84/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0033 - tp: 195.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 105.0000 - accuracy: 0.9993 - precision: 0.8744 - recall: 0.6500 - auc: 0.9393 - prc: 0.7555 - val_loss: 0.0030 - val_tp: 73.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 15.0000 - val_accuracy: 0.9995 - val_precision: 0.8795 - val_recall: 0.8295 - val_auc: 0.9486 - val_prc: 0.8559
Epoch 85/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0033 - tp: 196.0000 - fp: 26.0000 - tn: 181950.0000 - fn: 104.0000 - accuracy: 0.9993 - precision: 0.8829 - recall: 0.6533 - auc: 0.9459 - prc: 0.7552 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8578
Epoch 86/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0033 - tp: 194.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 106.0000 - accuracy: 0.9993 - precision: 0.8858 - recall: 0.6467 - auc: 0.9392 - prc: 0.7643 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8581
Epoch 87/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0033 - tp: 189.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 111.0000 - accuracy: 0.9992 - precision: 0.8710 - recall: 0.6300 - auc: 0.9459 - prc: 0.7549 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8596
Epoch 88/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0033 - tp: 187.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 113.0000 - accuracy: 0.9992 - precision: 0.8821 - recall: 0.6233 - auc: 0.9492 - prc: 0.7707 - val_loss: 0.0030 - val_tp: 73.0000 - val_fp: 13.0000 - val_tn: 45468.0000 - val_fn: 15.0000 - val_accuracy: 0.9994 - val_precision: 0.8488 - val_recall: 0.8295 - val_auc: 0.9486 - val_prc: 0.8552
Epoch 89/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0032 - tp: 199.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 101.0000 - accuracy: 0.9993 - precision: 0.8767 - recall: 0.6633 - auc: 0.9426 - prc: 0.7671 - val_loss: 0.0029 - val_tp: 73.0000 - val_fp: 10.0000 - val_tn: 45471.0000 - val_fn: 15.0000 - val_accuracy: 0.9995 - val_precision: 0.8795 - val_recall: 0.8295 - val_auc: 0.9486 - val_prc: 0.8580
Epoch 90/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 189.0000 - fp: 28.0000 - tn: 181948.0000 - fn: 111.0000 - accuracy: 0.9992 - precision: 0.8710 - recall: 0.6300 - auc: 0.9408 - prc: 0.7518 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8601
Epoch 91/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0032 - tp: 196.0000 - fp: 21.0000 - tn: 181955.0000 - fn: 104.0000 - accuracy: 0.9993 - precision: 0.9032 - recall: 0.6533 - auc: 0.9426 - prc: 0.7775 - val_loss: 0.0030 - val_tp: 73.0000 - val_fp: 11.0000 - val_tn: 45470.0000 - val_fn: 15.0000 - val_accuracy: 0.9994 - val_precision: 0.8690 - val_recall: 0.8295 - val_auc: 0.9486 - val_prc: 0.8576
Epoch 92/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0033 - tp: 198.0000 - fp: 31.0000 - tn: 181945.0000 - fn: 102.0000 - accuracy: 0.9993 - precision: 0.8646 - recall: 0.6600 - auc: 0.9475 - prc: 0.7590 - val_loss: 0.0029 - val_tp: 69.0000 - val_fp: 7.0000 - val_tn: 45474.0000 - val_fn: 19.0000 - val_accuracy: 0.9994 - val_precision: 0.9079 - val_recall: 0.7841 - val_auc: 0.9486 - val_prc: 0.8609
Epoch 93/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0032 - tp: 191.0000 - fp: 23.0000 - tn: 181953.0000 - fn: 109.0000 - accuracy: 0.9993 - precision: 0.8925 - recall: 0.6367 - auc: 0.9325 - prc: 0.7718 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8574
Epoch 94/100
90/90 [==============================] - 1s 8ms/step - loss: 0.0032 - tp: 197.0000 - fp: 24.0000 - tn: 181952.0000 - fn: 103.0000 - accuracy: 0.9993 - precision: 0.8914 - recall: 0.6567 - auc: 0.9375 - prc: 0.7699 - val_loss: 0.0030 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8573
Epoch 95/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0032 - tp: 195.0000 - fp: 32.0000 - tn: 181944.0000 - fn: 105.0000 - accuracy: 0.9992 - precision: 0.8590 - recall: 0.6500 - auc: 0.9477 - prc: 0.7725 - val_loss: 0.0030 - val_tp: 72.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.8889 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8584
Epoch 96/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0030 - tp: 201.0000 - fp: 24.0000 - tn: 181952.0000 - fn: 99.0000 - accuracy: 0.9993 - precision: 0.8933 - recall: 0.6700 - auc: 0.9410 - prc: 0.7920 - val_loss: 0.0029 - val_tp: 72.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 16.0000 - val_accuracy: 0.9995 - val_precision: 0.9000 - val_recall: 0.8182 - val_auc: 0.9486 - val_prc: 0.8595
Epoch 97/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0032 - tp: 191.0000 - fp: 29.0000 - tn: 181947.0000 - fn: 109.0000 - accuracy: 0.9992 - precision: 0.8682 - recall: 0.6367 - auc: 0.9509 - prc: 0.7784 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8589
Epoch 98/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0031 - tp: 206.0000 - fp: 21.0000 - tn: 181955.0000 - fn: 94.0000 - accuracy: 0.9994 - precision: 0.9075 - recall: 0.6867 - auc: 0.9493 - prc: 0.7891 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8591
Epoch 99/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0035 - tp: 179.0000 - fp: 25.0000 - tn: 181951.0000 - fn: 121.0000 - accuracy: 0.9992 - precision: 0.8775 - recall: 0.5967 - auc: 0.9408 - prc: 0.7521 - val_loss: 0.0030 - val_tp: 73.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 15.0000 - val_accuracy: 0.9995 - val_precision: 0.8902 - val_recall: 0.8295 - val_auc: 0.9486 - val_prc: 0.8583
Epoch 100/100
90/90 [==============================] - 1s 7ms/step - loss: 0.0034 - tp: 196.0000 - fp: 31.0000 - tn: 181945.0000 - fn: 104.0000 - accuracy: 0.9993 - precision: 0.8634 - recall: 0.6533 - auc: 0.9508 - prc: 0.7584 - val_loss: 0.0029 - val_tp: 70.0000 - val_fp: 8.0000 - val_tn: 45473.0000 - val_fn: 18.0000 - val_accuracy: 0.9994 - val_precision: 0.8974 - val_recall: 0.7955 - val_auc: 0.9486 - val_prc: 0.8592

Check training history

In this section, you will produce plots of your model's accuracy and loss on the training and validation sets. These are useful to check for overfitting, which you can learn more about in the Overfit and underfit tutorial.

In addition, you can produce these plots for any of the metrics you created above; the loss, AUPRC, precision, and recall are plotted below as examples.

def plot_metrics(history):
  metrics = ['loss', 'prc', 'precision', 'recall']
  for n, metric in enumerate(metrics):
    name = metric.replace("_"," ").capitalize()
    plt.subplot(2,2,n+1)
    plt.plot(history.epoch, history.history[metric], color=colors[0], label='Train')
    plt.plot(history.epoch, history.history['val_'+metric],
             color=colors[0], linestyle="--", label='Val')
    plt.xlabel('Epoch')
    plt.ylabel(name)
    if metric == 'loss':
      plt.ylim([0, plt.ylim()[1]])
    elif metric == 'auc':
      plt.ylim([0.8,1])
    else:
      plt.ylim([0,1])

    plt.legend();
plot_metrics(baseline_history)


Evaluate metrics

You can use a confusion matrix to summarize the actual vs. predicted labels, where the X axis is the predicted label and the Y axis is the actual label:

train_predictions_baseline = model.predict(train_features, batch_size=BATCH_SIZE)
test_predictions_baseline = model.predict(test_features, batch_size=BATCH_SIZE)
def plot_cm(labels, predictions, p=0.5):
  cm = confusion_matrix(labels, predictions > p)
  plt.figure(figsize=(5,5))
  sns.heatmap(cm, annot=True, fmt="d")
  plt.title('Confusion matrix @{:.2f}'.format(p))
  plt.ylabel('Actual label')
  plt.xlabel('Predicted label')

  print('Legitimate Transactions Detected (True Negatives): ', cm[0][0])
  print('Legitimate Transactions Incorrectly Detected (False Positives): ', cm[0][1])
  print('Fraudulent Transactions Missed (False Negatives): ', cm[1][0])
  print('Fraudulent Transactions Detected (True Positives): ', cm[1][1])
  print('Total Fraudulent Transactions: ', np.sum(cm[1]))

Evaluate your model on the test dataset and display the results for the metrics you created above:

baseline_results = model.evaluate(test_features, test_labels,
                                  batch_size=BATCH_SIZE, verbose=0)
for name, value in zip(model.metrics_names, baseline_results):
  print(name, ': ', value)
print()

plot_cm(test_labels, test_predictions_baseline)
loss :  0.0030180125031620264
tp :  82.0
fp :  5.0
tn :  56853.0
fn :  22.0
accuracy :  0.9995260238647461
precision :  0.9425287246704102
recall :  0.7884615659713745
auc :  0.9372756481170654
prc :  0.8531103730201721

Legitimate Transactions Detected (True Negatives):  56853
Legitimate Transactions Incorrectly Detected (False Positives):  5
Fraudulent Transactions Missed (False Negatives):  22
Fraudulent Transactions Detected (True Positives):  82
Total Fraudulent Transactions:  104

[figure: confusion matrix for the baseline model on the test set]

If the model had predicted everything perfectly, this would be a diagonal matrix where values off the main diagonal, indicating incorrect predictions, would be zero. In this case, the matrix shows that you have relatively few false positives, meaning that there were relatively few legitimate transactions that were incorrectly flagged. However, you would likely want to have even fewer false negatives despite the cost of increasing the number of false positives. This trade-off may be preferable because false negatives would allow fraudulent transactions to go through, whereas false positives may cause an email to be sent to a customer to ask them to verify their card activity.
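
Since plot_cm takes the classification threshold p as an argument, one way to explore this trade-off is simply to lower the threshold and re-plot. A minimal sketch (0.1 here is an arbitrary illustrative value, not a tuned one):

# Lowering the threshold catches more fraud (fewer false negatives) at
# the price of more false alarms (more false positives). The value 0.1
# is only an illustration; choose a threshold that fits your cost model.
plot_cm(test_labels, test_predictions_baseline, p=0.1)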

Plot the ROC

Now plot the ROC. This plot is useful because it shows, at a glance, the range of performance the model can reach just by tuning the output threshold.

def plot_roc(name, labels, predictions, **kwargs):
  # Note: roc_curve returns false positive *rates* and true positive *rates*.
  fp, tp, _ = sklearn.metrics.roc_curve(labels, predictions)

  plt.plot(100*fp, 100*tp, label=name, linewidth=2, **kwargs)
  plt.xlabel('False positives [%]')
  plt.ylabel('True positives [%]')
  plt.xlim([-0.5,20])
  plt.ylim([80,100.5])
  plt.grid(True)
  ax = plt.gca()
  ax.set_aspect('equal')
plot_roc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_roc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')
plt.legend(loc='lower right');

[figure: ROC curves for the baseline model, train and test]
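
If you want to turn this plot into a concrete operating point, one common heuristic is Youden's J statistic, i.e. the threshold maximizing tpr - fpr. A minimal sketch using the baseline test predictions (one heuristic among many; it ignores the asymmetric error costs discussed below):

# Pick the ROC point furthest above the diagonal (Youden's J).
fpr, tpr, thresholds = sklearn.metrics.roc_curve(
    test_labels, test_predictions_baseline.ravel())
best = np.argmax(tpr - fpr)
print('Threshold maximizing tpr - fpr: {:.4f}'.format(thresholds[best]))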

Plot the AUPRC

Now plot the AUPRC: the area under the interpolated precision-recall curve, obtained by plotting (recall, precision) points for different values of the classification threshold. Depending on how it's calculated, PR AUC may be equivalent to the average precision of the model.

def plot_prc(name, labels, predictions, **kwargs):
  precision, recall, _ = sklearn.metrics.precision_recall_curve(labels, predictions)

  # Plot recall on the x-axis and precision on the y-axis to match the labels.
  plt.plot(recall, precision, label=name, linewidth=2, **kwargs)
  plt.xlabel('Recall')
  plt.ylabel('Precision')
  plt.grid(True)
  ax = plt.gca()
  ax.set_aspect('equal')
plot_prc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_prc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')
plt.legend(loc='lower right');

[figure: precision-recall curves for the baseline model, train and test]
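
To check the "average precision" remark above, you can compute it directly with sklearn; it should land close to (though not necessarily identical to) the interpolated prc value Keras reports:

# Average precision summarizes the PR curve as the precision at each
# threshold, weighted by the increase in recall over the previous one.
ap = sklearn.metrics.average_precision_score(
    test_labels, test_predictions_baseline.ravel())
print('Average precision (test): {:.4f}'.format(ap))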

It looks like the precision is relatively high, but the recall and the area under the ROC curve (AUC) aren't as high as you might like. Classifiers often face challenges when trying to maximize both precision and recall, which is especially true when working with imbalanced datasets. It is important to consider the costs of the different types of errors in the context of the problem you care about. In this example, a false negative (a fraudulent transaction is missed) may have a financial cost, while a false positive (a transaction is incorrectly flagged as fraudulent) may decrease user happiness.
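
One way to act on that advice is to assign explicit costs to each error type and sweep the threshold. A sketch, where COST_FN and COST_FP are made-up placeholder values rather than anything derived from this dataset:

# Sweep thresholds and report the one minimizing an assumed total cost.
COST_FN = 100.0  # hypothetical cost of letting a fraudulent transaction through
COST_FP = 1.0    # hypothetical cost of flagging a legitimate transaction

thresholds = np.linspace(0.01, 0.99, 99)
preds = test_predictions_baseline.ravel()
costs = [COST_FN * np.sum((test_labels == 1) & (preds <= t)) +
         COST_FP * np.sum((test_labels == 0) & (preds > t))
         for t in thresholds]
print('Cost-minimizing threshold: {:.2f}'.format(thresholds[np.argmin(costs)]))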

Class weights

Calculate class weights

The goal is to identify fraudulent transactions, but you don't have very many of those positive samples to work with, so you would want the classifier to heavily weight the few examples that are available. You can do this by passing Keras a weight for each class through the class_weight parameter. These will cause the model to "pay more attention" to examples from an under-represented class.

# Scaling by total/2 helps keep the loss to a similar magnitude.
# The sum of the weights of all examples stays the same.
weight_for_0 = (1 / neg) * (total / 2.0)
weight_for_1 = (1 / pos) * (total / 2.0)

class_weight = {0: weight_for_0, 1: weight_for_1}

print('Weight for class 0: {:.2f}'.format(weight_for_0))
print('Weight for class 1: {:.2f}'.format(weight_for_1))
Weight for class 0: 0.50
Weight for class 1: 289.44
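
For reference, sklearn's 'balanced' heuristic computes the same formula, n_samples / (n_classes * np.bincount(y)). A quick cross-check (applied here to the training labels, so the values will differ slightly from the full-dataset weights above):

from sklearn.utils.class_weight import compute_class_weight

# 'balanced' is equivalent to (1 / count) * (total / 2) for two classes,
# matching the manual calculation above.
weights = compute_class_weight(class_weight='balanced',
                               classes=np.array([0, 1]), y=train_labels)
print('sklearn weights:', dict(zip([0, 1], weights)))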

Train a model with class weights

Now try re-training and evaluating the model with class weights to see how that affects the predictions.

weighted_model = make_model()
weighted_model.load_weights(initial_weights)

weighted_history = weighted_model.fit(
    train_features,
    train_labels,
    batch_size=BATCH_SIZE,
    epochs=EPOCHS,
    callbacks=[early_stopping],
    validation_data=(val_features, val_labels),
    # The class weights go here
    class_weight=class_weight)
Epoch 1/100
90/90 [==============================] - 3s 17ms/step - loss: 2.4908 - tp: 117.0000 - fp: 127.0000 - tn: 238707.0000 - fn: 287.0000 - accuracy: 0.9983 - precision: 0.4795 - recall: 0.2896 - auc: 0.7329 - prc: 0.2830 - val_loss: 0.0091 - val_tp: 25.0000 - val_fp: 9.0000 - val_tn: 45472.0000 - val_fn: 63.0000 - val_accuracy: 0.9984 - val_precision: 0.7353 - val_recall: 0.2841 - val_auc: 0.9311 - val_prc: 0.4668
Epoch 2/100
90/90 [==============================] - 1s 8ms/step - loss: 1.1350 - tp: 128.0000 - fp: 367.0000 - tn: 181609.0000 - fn: 172.0000 - accuracy: 0.9970 - precision: 0.2586 - recall: 0.4267 - auc: 0.8473 - prc: 0.3033 - val_loss: 0.0105 - val_tp: 63.0000 - val_fp: 22.0000 - val_tn: 45459.0000 - val_fn: 25.0000 - val_accuracy: 0.9990 - val_precision: 0.7412 - val_recall: 0.7159 - val_auc: 0.9546 - val_prc: 0.6863
Epoch 3/100
90/90 [==============================] - 1s 7ms/step - loss: 0.7347 - tp: 181.0000 - fp: 1105.0000 - tn: 180871.0000 - fn: 119.0000 - accuracy: 0.9933 - precision: 0.1407 - recall: 0.6033 - auc: 0.8961 - prc: 0.3683 - val_loss: 0.0166 - val_tp: 75.0000 - val_fp: 59.0000 - val_tn: 45422.0000 - val_fn: 13.0000 - val_accuracy: 0.9984 - val_precision: 0.5597 - val_recall: 0.8523 - val_auc: 0.9531 - val_prc: 0.7303
Epoch 4/100
90/90 [==============================] - 1s 8ms/step - loss: 0.5927 - tp: 204.0000 - fp: 2153.0000 - tn: 179823.0000 - fn: 96.0000 - accuracy: 0.9877 - precision: 0.0866 - recall: 0.6800 - auc: 0.9051 - prc: 0.3409 - val_loss: 0.0255 - val_tp: 79.0000 - val_fp: 207.0000 - val_tn: 45274.0000 - val_fn: 9.0000 - val_accuracy: 0.9953 - val_precision: 0.2762 - val_recall: 0.8977 - val_auc: 0.9580 - val_prc: 0.7195
Epoch 5/100
90/90 [==============================] - 1s 8ms/step - loss: 0.4751 - tp: 230.0000 - fp: 2891.0000 - tn: 179085.0000 - fn: 70.0000 - accuracy: 0.9838 - precision: 0.0737 - recall: 0.7667 - auc: 0.9168 - prc: 0.2923 - val_loss: 0.0344 - val_tp: 79.0000 - val_fp: 376.0000 - val_tn: 45105.0000 - val_fn: 9.0000 - val_accuracy: 0.9916 - val_precision: 0.1736 - val_recall: 0.8977 - val_auc: 0.9577 - val_prc: 0.6827
Epoch 6/100
90/90 [==============================] - 1s 7ms/step - loss: 0.3777 - tp: 239.0000 - fp: 3813.0000 - tn: 178163.0000 - fn: 61.0000 - accuracy: 0.9787 - precision: 0.0590 - recall: 0.7967 - auc: 0.9352 - prc: 0.2408 - val_loss: 0.0441 - val_tp: 80.0000 - val_fp: 502.0000 - val_tn: 44979.0000 - val_fn: 8.0000 - val_accuracy: 0.9888 - val_precision: 0.1375 - val_recall: 0.9091 - val_auc: 0.9602 - val_prc: 0.6813
Epoch 7/100
90/90 [==============================] - 1s 7ms/step - loss: 0.3936 - tp: 245.0000 - fp: 4649.0000 - tn: 177327.0000 - fn: 55.0000 - accuracy: 0.9742 - precision: 0.0501 - recall: 0.8167 - auc: 0.9277 - prc: 0.2108 - val_loss: 0.0540 - val_tp: 80.0000 - val_fp: 630.0000 - val_tn: 44851.0000 - val_fn: 8.0000 - val_accuracy: 0.9860 - val_precision: 0.1127 - val_recall: 0.9091 - val_auc: 0.9610 - val_prc: 0.6825
Epoch 8/100
90/90 [==============================] - 1s 7ms/step - loss: 0.3975 - tp: 243.0000 - fp: 5322.0000 - tn: 176654.0000 - fn: 57.0000 - accuracy: 0.9705 - precision: 0.0437 - recall: 0.8100 - auc: 0.9232 - prc: 0.2013 - val_loss: 0.0623 - val_tp: 80.0000 - val_fp: 719.0000 - val_tn: 44762.0000 - val_fn: 8.0000 - val_accuracy: 0.9840 - val_precision: 0.1001 - val_recall: 0.9091 - val_auc: 0.9614 - val_prc: 0.6878
Epoch 9/100
90/90 [==============================] - 1s 7ms/step - loss: 0.3066 - tp: 254.0000 - fp: 5795.0000 - tn: 176181.0000 - fn: 46.0000 - accuracy: 0.9680 - precision: 0.0420 - recall: 0.8467 - auc: 0.9470 - prc: 0.1891 - val_loss: 0.0672 - val_tp: 80.0000 - val_fp: 758.0000 - val_tn: 44723.0000 - val_fn: 8.0000 - val_accuracy: 0.9832 - val_precision: 0.0955 - val_recall: 0.9091 - val_auc: 0.9631 - val_prc: 0.6807
Epoch 10/100
90/90 [==============================] - 1s 8ms/step - loss: 0.3111 - tp: 254.0000 - fp: 5995.0000 - tn: 175981.0000 - fn: 46.0000 - accuracy: 0.9669 - precision: 0.0406 - recall: 0.8467 - auc: 0.9473 - prc: 0.1843 - val_loss: 0.0722 - val_tp: 80.0000 - val_fp: 806.0000 - val_tn: 44675.0000 - val_fn: 8.0000 - val_accuracy: 0.9821 - val_precision: 0.0903 - val_recall: 0.9091 - val_auc: 0.9649 - val_prc: 0.6524
Epoch 11/100
90/90 [==============================] - 1s 8ms/step - loss: 0.3300 - tp: 251.0000 - fp: 6288.0000 - tn: 175688.0000 - fn: 49.0000 - accuracy: 0.9652 - precision: 0.0384 - recall: 0.8367 - auc: 0.9402 - prc: 0.1755 - val_loss: 0.0767 - val_tp: 80.0000 - val_fp: 843.0000 - val_tn: 44638.0000 - val_fn: 8.0000 - val_accuracy: 0.9813 - val_precision: 0.0867 - val_recall: 0.9091 - val_auc: 0.9652 - val_prc: 0.6501
Epoch 12/100
90/90 [==============================] - 1s 8ms/step - loss: 0.3178 - tp: 254.0000 - fp: 6563.0000 - tn: 175413.0000 - fn: 46.0000 - accuracy: 0.9637 - precision: 0.0373 - recall: 0.8467 - auc: 0.9421 - prc: 0.1784 - val_loss: 0.0796 - val_tp: 80.0000 - val_fp: 875.0000 - val_tn: 44606.0000 - val_fn: 8.0000 - val_accuracy: 0.9806 - val_precision: 0.0838 - val_recall: 0.9091 - val_auc: 0.9661 - val_prc: 0.6500
Epoch 13/100
89/90 [============================>.] - ETA: 0s - loss: 0.2418 - tp: 264.0000 - fp: 6410.0000 - tn: 175562.0000 - fn: 36.0000 - accuracy: 0.9646 - precision: 0.0396 - recall: 0.8800 - auc: 0.9620 - prc: 0.1929Restoring model weights from the end of the best epoch: 3.
90/90 [==============================] - 1s 8ms/step - loss: 0.2417 - tp: 264.0000 - fp: 6410.0000 - tn: 175566.0000 - fn: 36.0000 - accuracy: 0.9646 - precision: 0.0396 - recall: 0.8800 - auc: 0.9620 - prc: 0.1929 - val_loss: 0.0791 - val_tp: 80.0000 - val_fp: 866.0000 - val_tn: 44615.0000 - val_fn: 8.0000 - val_accuracy: 0.9808 - val_precision: 0.0846 - val_recall: 0.9091 - val_auc: 0.9673 - val_prc: 0.6443
Epoch 13: early stopping

Check training history

plot_metrics(weighted_history)

[figure: training history for the class-weighted model]

Evaluate metrics

train_predictions_weighted = weighted_model.predict(train_features, batch_size=BATCH_SIZE)
test_predictions_weighted = weighted_model.predict(test_features, batch_size=BATCH_SIZE)
weighted_results = weighted_model.evaluate(test_features, test_labels,
                                           batch_size=BATCH_SIZE, verbose=0)
for name, value in zip(weighted_model.metrics_names, weighted_results):
  print(name, ': ', value)
print()

plot_cm(test_labels, test_predictions_weighted)
loss :  0.015673985704779625
tp :  88.0
fp :  64.0
tn :  56794.0
fn :  16.0
accuracy :  0.9985955357551575
precision :  0.5789473652839661
recall :  0.8461538553237915
auc :  0.9661166071891785
prc :  0.7658032178878784

Legitimate Transactions Detected (True Negatives):  56794
Legitimate Transactions Incorrectly Detected (False Positives):  64
Fraudulent Transactions Missed (False Negatives):  16
Fraudulent Transactions Detected (True Positives):  88
Total Fraudulent Transactions:  104

[figure: confusion matrix for the class-weighted model]

Here you can see that with class weights the accuracy and precision are lower because there are more false positives, but conversely the recall and AUC are higher because the model also found more true positives. Despite having lower precision, this model has higher recall (and identifies more fraudulent transactions). Of course, there is a cost to both types of error (you wouldn't want to bug users by flagging too many legitimate transactions as fraudulent, either). Carefully consider the trade-offs between these different types of errors for your application.

Plot the ROC

plot_roc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_roc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')

plot_roc("Train Weighted", train_labels, train_predictions_weighted, color=colors[1])
plot_roc("Test Weighted", test_labels, test_predictions_weighted, color=colors[1], linestyle='--')


plt.legend(loc='lower right');

[figure: ROC curves, baseline vs. weighted]

Plot the AUPRC

plot_prc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_prc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')

plot_prc("Train Weighted", train_labels, train_predictions_weighted, color=colors[1])
plot_prc("Test Weighted", test_labels, test_predictions_weighted, color=colors[1], linestyle='--')


plt.legend(loc='lower right');

[figure: precision-recall curves, baseline vs. weighted]

Oversampling

Oversample the minority class

A related approach would be to resample the dataset by oversampling the minority class.

pos_features = train_features[bool_train_labels]
neg_features = train_features[~bool_train_labels]

pos_labels = train_labels[bool_train_labels]
neg_labels = train_labels[~bool_train_labels]

Using NumPy

You can balance the dataset manually by choosing the right number of random indices from the positive examples:

ids = np.arange(len(pos_features))
choices = np.random.choice(ids, len(neg_features))

res_pos_features = pos_features[choices]
res_pos_labels = pos_labels[choices]

res_pos_features.shape
(181976, 29)
resampled_features = np.concatenate([res_pos_features, neg_features], axis=0)
resampled_labels = np.concatenate([res_pos_labels, neg_labels], axis=0)

order = np.arange(len(resampled_labels))
np.random.shuffle(order)
resampled_features = resampled_features[order]
resampled_labels = resampled_labels[order]

resampled_features.shape
(363952, 29)
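
As a quick sanity check (assuming the arrays built above), the resampled labels should now be exactly half positive:

# The positive class was oversampled to match the negative count,
# so the mean label should be 0.5.
print('Positive fraction after resampling: {:.2f}'.format(
    resampled_labels.mean()))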

Using tf.data

If you're using tf.data, the easiest way to produce balanced examples is to start with a positive dataset and a negative dataset, and merge them. See the tf.data guide for more examples.

BUFFER_SIZE = 100000

def make_ds(features, labels):
  ds = tf.data.Dataset.from_tensor_slices((features, labels))#.cache()
  ds = ds.shuffle(BUFFER_SIZE).repeat()
  return ds

pos_ds = make_ds(pos_features, pos_labels)
neg_ds = make_ds(neg_features, neg_labels)

Each dataset provides (feature, label) pairs:

for features, label in pos_ds.take(1):
  print("Features:\n", features.numpy())
  print()
  print("Label: ", label.numpy())
Features:
 [-1.70995646e+00  6.07169433e-02 -4.04871449e+00  2.40588467e+00
  3.27081930e-01 -1.17238978e+00 -1.09074333e+00  5.57028845e-01
 -2.90711834e+00 -4.55481109e+00  3.65557488e+00 -4.64769769e+00
  1.02148437e+00 -5.00000000e+00 -6.47163060e-01 -5.00000000e+00
 -5.00000000e+00 -1.89067352e+00  2.36913527e+00  3.35002702e-01
  1.16080677e+00  1.71862788e+00  2.98862257e+00 -1.93354761e-01
  2.33545393e+00 -2.13017771e-03  2.55719520e+00  1.14788658e-02
  1.39790139e+00]

Label:  1

Merge the two together using tf.data.Dataset.sample_from_datasets:

resampled_ds = tf.data.Dataset.sample_from_datasets([pos_ds, neg_ds], weights=[0.5, 0.5])
resampled_ds = resampled_ds.batch(BATCH_SIZE).prefetch(2)
for features, label in resampled_ds.take(1):
  print(label.numpy().mean())
0.4990234375

To use this dataset, you'll need the number of steps per epoch.

The definition of "epoch" in this case is less clear. Say it's the number of batches required to see each negative example once; since the resampled data is roughly half negative, that takes about 2.0*neg examples, i.e. np.ceil(2.0*neg/BATCH_SIZE) batches:

resampled_steps_per_epoch = np.ceil(2.0*neg/BATCH_SIZE)
resampled_steps_per_epoch
278.0

Train on the oversampled data

Now try training the model with the resampled data set instead of using class weights to see how these methods compare.

resampled_model = make_model()
resampled_model.load_weights(initial_weights)

# Reset the bias to zero, since this dataset is balanced.
output_layer = resampled_model.layers[-1] 
output_layer.bias.assign([0])

val_ds = tf.data.Dataset.from_tensor_slices((val_features, val_labels)).cache()
val_ds = val_ds.batch(BATCH_SIZE).prefetch(2) 

resampled_history = resampled_model.fit(
    resampled_ds,
    epochs=EPOCHS,
    steps_per_epoch=resampled_steps_per_epoch,
    callbacks=[early_stopping],
    validation_data=val_ds)
Epoch 1/100
278/278 [==============================] - 11s 34ms/step - loss: 0.5088 - tp: 230540.0000 - fp: 87893.0000 - tn: 253318.0000 - fn: 54555.0000 - accuracy: 0.7726 - precision: 0.7240 - recall: 0.8086 - auc: 0.8724 - prc: 0.8948 - val_loss: 0.2745 - val_tp: 80.0000 - val_fp: 1324.0000 - val_tn: 44157.0000 - val_fn: 8.0000 - val_accuracy: 0.9708 - val_precision: 0.0570 - val_recall: 0.9091 - val_auc: 0.9607 - val_prc: 0.7477
Epoch 2/100
278/278 [==============================] - 9s 31ms/step - loss: 0.2251 - tp: 255911.0000 - fp: 18957.0000 - tn: 265276.0000 - fn: 29200.0000 - accuracy: 0.9154 - precision: 0.9310 - recall: 0.8976 - auc: 0.9673 - prc: 0.9748 - val_loss: 0.1428 - val_tp: 81.0000 - val_fp: 760.0000 - val_tn: 44721.0000 - val_fn: 7.0000 - val_accuracy: 0.9832 - val_precision: 0.0963 - val_recall: 0.9205 - val_auc: 0.9745 - val_prc: 0.7576
Epoch 3/100
278/278 [==============================] - 9s 32ms/step - loss: 0.1657 - tp: 261422.0000 - fp: 10411.0000 - tn: 274107.0000 - fn: 23404.0000 - accuracy: 0.9406 - precision: 0.9617 - recall: 0.9178 - auc: 0.9824 - prc: 0.9856 - val_loss: 0.0971 - val_tp: 79.0000 - val_fp: 703.0000 - val_tn: 44778.0000 - val_fn: 9.0000 - val_accuracy: 0.9844 - val_precision: 0.1010 - val_recall: 0.8977 - val_auc: 0.9770 - val_prc: 0.7608
Epoch 4/100
278/278 [==============================] - 9s 31ms/step - loss: 0.1414 - tp: 264138.0000 - fp: 8560.0000 - tn: 275567.0000 - fn: 21079.0000 - accuracy: 0.9479 - precision: 0.9686 - recall: 0.9261 - auc: 0.9875 - prc: 0.9892 - val_loss: 0.0822 - val_tp: 79.0000 - val_fp: 720.0000 - val_tn: 44761.0000 - val_fn: 9.0000 - val_accuracy: 0.9840 - val_precision: 0.0989 - val_recall: 0.8977 - val_auc: 0.9785 - val_prc: 0.7223
Epoch 5/100
278/278 [==============================] - 9s 31ms/step - loss: 0.1254 - tp: 265494.0000 - fp: 7921.0000 - tn: 277121.0000 - fn: 18808.0000 - accuracy: 0.9531 - precision: 0.9710 - recall: 0.9338 - auc: 0.9906 - prc: 0.9915 - val_loss: 0.0726 - val_tp: 79.0000 - val_fp: 693.0000 - val_tn: 44788.0000 - val_fn: 9.0000 - val_accuracy: 0.9846 - val_precision: 0.1023 - val_recall: 0.8977 - val_auc: 0.9790 - val_prc: 0.7068
Epoch 6/100
278/278 [==============================] - 8s 31ms/step - loss: 0.1150 - tp: 267476.0000 - fp: 7403.0000 - tn: 277131.0000 - fn: 17334.0000 - accuracy: 0.9566 - precision: 0.9731 - recall: 0.9391 - auc: 0.9925 - prc: 0.9929 - val_loss: 0.0651 - val_tp: 79.0000 - val_fp: 647.0000 - val_tn: 44834.0000 - val_fn: 9.0000 - val_accuracy: 0.9856 - val_precision: 0.1088 - val_recall: 0.8977 - val_auc: 0.9797 - val_prc: 0.7061
Epoch 7/100
278/278 [==============================] - 8s 31ms/step - loss: 0.1065 - tp: 268558.0000 - fp: 7008.0000 - tn: 277671.0000 - fn: 16107.0000 - accuracy: 0.9594 - precision: 0.9746 - recall: 0.9434 - auc: 0.9938 - prc: 0.9939 - val_loss: 0.0602 - val_tp: 79.0000 - val_fp: 616.0000 - val_tn: 44865.0000 - val_fn: 9.0000 - val_accuracy: 0.9863 - val_precision: 0.1137 - val_recall: 0.8977 - val_auc: 0.9795 - val_prc: 0.7053
Epoch 8/100
278/278 [==============================] - 9s 31ms/step - loss: 0.1010 - tp: 269426.0000 - fp: 6824.0000 - tn: 277689.0000 - fn: 15405.0000 - accuracy: 0.9610 - precision: 0.9753 - recall: 0.9459 - auc: 0.9945 - prc: 0.9944 - val_loss: 0.0556 - val_tp: 79.0000 - val_fp: 580.0000 - val_tn: 44901.0000 - val_fn: 9.0000 - val_accuracy: 0.9871 - val_precision: 0.1199 - val_recall: 0.8977 - val_auc: 0.9782 - val_prc: 0.7055
Epoch 9/100
278/278 [==============================] - 9s 31ms/step - loss: 0.0956 - tp: 269858.0000 - fp: 6522.0000 - tn: 278173.0000 - fn: 14791.0000 - accuracy: 0.9626 - precision: 0.9764 - recall: 0.9480 - auc: 0.9952 - prc: 0.9950 - val_loss: 0.0516 - val_tp: 79.0000 - val_fp: 550.0000 - val_tn: 44931.0000 - val_fn: 9.0000 - val_accuracy: 0.9877 - val_precision: 0.1256 - val_recall: 0.8977 - val_auc: 0.9748 - val_prc: 0.6982
Epoch 10/100
278/278 [==============================] - 9s 33ms/step - loss: 0.0906 - tp: 270484.0000 - fp: 6297.0000 - tn: 278534.0000 - fn: 14029.0000 - accuracy: 0.9643 - precision: 0.9772 - recall: 0.9507 - auc: 0.9957 - prc: 0.9954 - val_loss: 0.0480 - val_tp: 79.0000 - val_fp: 533.0000 - val_tn: 44948.0000 - val_fn: 9.0000 - val_accuracy: 0.9881 - val_precision: 0.1291 - val_recall: 0.8977 - val_auc: 0.9711 - val_prc: 0.6974
Epoch 11/100
278/278 [==============================] - 9s 32ms/step - loss: 0.0872 - tp: 271073.0000 - fp: 6184.0000 - tn: 278743.0000 - fn: 13344.0000 - accuracy: 0.9657 - precision: 0.9777 - recall: 0.9531 - auc: 0.9959 - prc: 0.9955 - val_loss: 0.0451 - val_tp: 79.0000 - val_fp: 499.0000 - val_tn: 44982.0000 - val_fn: 9.0000 - val_accuracy: 0.9889 - val_precision: 0.1367 - val_recall: 0.8977 - val_auc: 0.9718 - val_prc: 0.6978
Epoch 12/100
278/278 [==============================] - 9s 31ms/step - loss: 0.0845 - tp: 272241.0000 - fp: 6027.0000 - tn: 278427.0000 - fn: 12649.0000 - accuracy: 0.9672 - precision: 0.9783 - recall: 0.9556 - auc: 0.9961 - prc: 0.9957 - val_loss: 0.0429 - val_tp: 80.0000 - val_fp: 491.0000 - val_tn: 44990.0000 - val_fn: 8.0000 - val_accuracy: 0.9890 - val_precision: 0.1401 - val_recall: 0.9091 - val_auc: 0.9724 - val_prc: 0.6985
Epoch 13/100
278/278 [==============================] - ETA: 0s - loss: 0.0819 - tp: 272096.0000 - fp: 6016.0000 - tn: 279208.0000 - fn: 12024.0000 - accuracy: 0.9683 - precision: 0.9784 - recall: 0.9577 - auc: 0.9964 - prc: 0.9959Restoring model weights from the end of the best epoch: 3.
278/278 [==============================] - 9s 31ms/step - loss: 0.0819 - tp: 272096.0000 - fp: 6016.0000 - tn: 279208.0000 - fn: 12024.0000 - accuracy: 0.9683 - precision: 0.9784 - recall: 0.9577 - auc: 0.9964 - prc: 0.9959 - val_loss: 0.0414 - val_tp: 80.0000 - val_fp: 481.0000 - val_tn: 45000.0000 - val_fn: 8.0000 - val_accuracy: 0.9893 - val_precision: 0.1426 - val_recall: 0.9091 - val_auc: 0.9725 - val_prc: 0.6919
Epoch 13: early stopping

If the training process were considering the whole dataset on each gradient update, this oversampling would be basically identical to the class weighting.
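
To make that equivalence concrete, here is a short derivation (a sketch, using $N$, $N_{pos}$, and $N_{neg}$ for the total, positive, and negative example counts, and $\ell_i$ for the per-example loss). With the weights computed above, $w_1 = N/(2N_{pos})$ and $w_0 = N/(2N_{neg})$, the class-weighted loss is

$$\frac{1}{N}\sum_i w_{y_i}\,\ell_i \;=\; \frac{1}{2}\cdot\frac{1}{N_{pos}}\sum_{i:\,y_i=1}\ell_i \;+\; \frac{1}{2}\cdot\frac{1}{N_{neg}}\sum_{i:\,y_i=0}\ell_i,$$

which is exactly the expected per-example loss when each batch element is drawn from the positive and negative classes with probability 1/2, as sample_from_datasets does here.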

But when training the model batch-wise, as you did here, the oversampled data provides a smoother gradient signal: instead of each positive example being shown in one batch with a large weight, they're shown in many different batches each time with a small weight.

This smoother gradient signal makes it easier to train the model.

Check training history

Note that the distributions of the metrics will be different here, because the training data has a totally different distribution from the validation and test data.

plot_metrics(resampled_history)

[figure: training history for the resampled model]

Re-train

Because training is easier on the balanced data, the above training procedure may overfit quickly.

So break up the epochs to give tf.keras.callbacks.EarlyStopping finer control over when to stop training.
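
For reference, an EarlyStopping configuration consistent with the logs in this tutorial (best-epoch restoration and a patience of 10 epochs, monitoring val_prc) would look like this sketch; the actual callback was defined earlier in the notebook:

early_stopping = tf.keras.callbacks.EarlyStopping(
    monitor='val_prc',  # stop when the validation PR AUC stops improving
    mode='max',
    patience=10,
    verbose=1,
    restore_best_weights=True)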

resampled_model = make_model()
resampled_model.load_weights(initial_weights)

# Reset the bias to zero, since this dataset is balanced.
output_layer = resampled_model.layers[-1] 
output_layer.bias.assign([0])

resampled_history = resampled_model.fit(
    resampled_ds,
    # These are not real epochs
    steps_per_epoch=20,
    epochs=10*EPOCHS,
    callbacks=[early_stopping],
    validation_data=(val_ds))
Epoch 1/1000
20/20 [==============================] - 3s 82ms/step - loss: 1.3690 - tp: 8842.0000 - fp: 12784.0000 - tn: 53027.0000 - fn: 11876.0000 - accuracy: 0.7150 - precision: 0.4089 - recall: 0.4268 - auc: 0.7595 - prc: 0.5091 - val_loss: 0.8506 - val_tp: 67.0000 - val_fp: 29685.0000 - val_tn: 15796.0000 - val_fn: 21.0000 - val_accuracy: 0.3481 - val_precision: 0.0023 - val_recall: 0.7614 - val_auc: 0.6300 - val_prc: 0.0062
Epoch 2/1000
20/20 [==============================] - 1s 38ms/step - loss: 0.8734 - tp: 13189.0000 - fp: 12148.0000 - tn: 8360.0000 - fn: 7263.0000 - accuracy: 0.5261 - precision: 0.5205 - recall: 0.6449 - auc: 0.6060 - prc: 0.7169 - val_loss: 0.7935 - val_tp: 84.0000 - val_fp: 27319.0000 - val_tn: 18162.0000 - val_fn: 4.0000 - val_accuracy: 0.4004 - val_precision: 0.0031 - val_recall: 0.9545 - val_auc: 0.9085 - val_prc: 0.2791
Epoch 3/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.6591 - tp: 15609.0000 - fp: 10735.0000 - tn: 9709.0000 - fn: 4907.0000 - accuracy: 0.6181 - precision: 0.5925 - recall: 0.7608 - auc: 0.7480 - prc: 0.8226 - val_loss: 0.7176 - val_tp: 82.0000 - val_fp: 22854.0000 - val_tn: 22627.0000 - val_fn: 6.0000 - val_accuracy: 0.4983 - val_precision: 0.0036 - val_recall: 0.9318 - val_auc: 0.9326 - val_prc: 0.5680
Epoch 4/1000
20/20 [==============================] - 1s 38ms/step - loss: 0.5539 - tp: 16763.0000 - fp: 9451.0000 - tn: 11067.0000 - fn: 3679.0000 - accuracy: 0.6794 - precision: 0.6395 - recall: 0.8200 - auc: 0.8230 - prc: 0.8754 - val_loss: 0.6429 - val_tp: 81.0000 - val_fp: 17203.0000 - val_tn: 28278.0000 - val_fn: 7.0000 - val_accuracy: 0.6223 - val_precision: 0.0047 - val_recall: 0.9205 - val_auc: 0.9382 - val_prc: 0.6592
Epoch 5/1000
20/20 [==============================] - 1s 38ms/step - loss: 0.4977 - tp: 17232.0000 - fp: 8173.0000 - tn: 12322.0000 - fn: 3233.0000 - accuracy: 0.7215 - precision: 0.6783 - recall: 0.8420 - auc: 0.8566 - prc: 0.8994 - val_loss: 0.5758 - val_tp: 81.0000 - val_fp: 12096.0000 - val_tn: 33385.0000 - val_fn: 7.0000 - val_accuracy: 0.7344 - val_precision: 0.0067 - val_recall: 0.9205 - val_auc: 0.9430 - val_prc: 0.6902
Epoch 6/1000
20/20 [==============================] - 1s 39ms/step - loss: 0.4476 - tp: 17521.0000 - fp: 6825.0000 - tn: 13574.0000 - fn: 3040.0000 - accuracy: 0.7592 - precision: 0.7197 - recall: 0.8521 - auc: 0.8820 - prc: 0.9173 - val_loss: 0.5188 - val_tp: 81.0000 - val_fp: 8455.0000 - val_tn: 37026.0000 - val_fn: 7.0000 - val_accuracy: 0.8143 - val_precision: 0.0095 - val_recall: 0.9205 - val_auc: 0.9474 - val_prc: 0.7144
Epoch 7/1000
20/20 [==============================] - 1s 38ms/step - loss: 0.4159 - tp: 17568.0000 - fp: 5807.0000 - tn: 14655.0000 - fn: 2930.0000 - accuracy: 0.7867 - precision: 0.7516 - recall: 0.8571 - auc: 0.8961 - prc: 0.9275 - val_loss: 0.4714 - val_tp: 81.0000 - val_fp: 6057.0000 - val_tn: 39424.0000 - val_fn: 7.0000 - val_accuracy: 0.8669 - val_precision: 0.0132 - val_recall: 0.9205 - val_auc: 0.9504 - val_prc: 0.7150
Epoch 8/1000
20/20 [==============================] - 1s 40ms/step - loss: 0.3878 - tp: 17820.0000 - fp: 4889.0000 - tn: 15384.0000 - fn: 2867.0000 - accuracy: 0.8106 - precision: 0.7847 - recall: 0.8614 - auc: 0.9084 - prc: 0.9366 - val_loss: 0.4308 - val_tp: 80.0000 - val_fp: 4423.0000 - val_tn: 41058.0000 - val_fn: 8.0000 - val_accuracy: 0.9028 - val_precision: 0.0178 - val_recall: 0.9091 - val_auc: 0.9521 - val_prc: 0.7212
Epoch 9/1000
20/20 [==============================] - 1s 40ms/step - loss: 0.3640 - tp: 17911.0000 - fp: 4232.0000 - tn: 16145.0000 - fn: 2672.0000 - accuracy: 0.8314 - precision: 0.8089 - recall: 0.8702 - auc: 0.9195 - prc: 0.9435 - val_loss: 0.3956 - val_tp: 80.0000 - val_fp: 3289.0000 - val_tn: 42192.0000 - val_fn: 8.0000 - val_accuracy: 0.9276 - val_precision: 0.0237 - val_recall: 0.9091 - val_auc: 0.9533 - val_prc: 0.7290
Epoch 10/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.3349 - tp: 18031.0000 - fp: 3595.0000 - tn: 16870.0000 - fn: 2464.0000 - accuracy: 0.8521 - precision: 0.8338 - recall: 0.8798 - auc: 0.9315 - prc: 0.9515 - val_loss: 0.3639 - val_tp: 80.0000 - val_fp: 2497.0000 - val_tn: 42984.0000 - val_fn: 8.0000 - val_accuracy: 0.9450 - val_precision: 0.0310 - val_recall: 0.9091 - val_auc: 0.9548 - val_prc: 0.7271
Epoch 11/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.3226 - tp: 18061.0000 - fp: 3143.0000 - tn: 17279.0000 - fn: 2477.0000 - accuracy: 0.8628 - precision: 0.8518 - recall: 0.8794 - auc: 0.9351 - prc: 0.9540 - val_loss: 0.3371 - val_tp: 80.0000 - val_fp: 2002.0000 - val_tn: 43479.0000 - val_fn: 8.0000 - val_accuracy: 0.9559 - val_precision: 0.0384 - val_recall: 0.9091 - val_auc: 0.9563 - val_prc: 0.7383
Epoch 12/1000
20/20 [==============================] - 1s 36ms/step - loss: 0.3051 - tp: 18156.0000 - fp: 2760.0000 - tn: 17604.0000 - fn: 2440.0000 - accuracy: 0.8730 - precision: 0.8680 - recall: 0.8815 - auc: 0.9409 - prc: 0.9580 - val_loss: 0.3144 - val_tp: 80.0000 - val_fp: 1694.0000 - val_tn: 43787.0000 - val_fn: 8.0000 - val_accuracy: 0.9627 - val_precision: 0.0451 - val_recall: 0.9091 - val_auc: 0.9575 - val_prc: 0.7333
Epoch 13/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.2989 - tp: 17991.0000 - fp: 2630.0000 - tn: 17929.0000 - fn: 2410.0000 - accuracy: 0.8770 - precision: 0.8725 - recall: 0.8819 - auc: 0.9431 - prc: 0.9586 - val_loss: 0.2940 - val_tp: 80.0000 - val_fp: 1454.0000 - val_tn: 44027.0000 - val_fn: 8.0000 - val_accuracy: 0.9679 - val_precision: 0.0522 - val_recall: 0.9091 - val_auc: 0.9589 - val_prc: 0.7397
Epoch 14/1000
20/20 [==============================] - 1s 39ms/step - loss: 0.2803 - tp: 18224.0000 - fp: 2312.0000 - tn: 18068.0000 - fn: 2356.0000 - accuracy: 0.8860 - precision: 0.8874 - recall: 0.8855 - auc: 0.9496 - prc: 0.9634 - val_loss: 0.2759 - val_tp: 80.0000 - val_fp: 1319.0000 - val_tn: 44162.0000 - val_fn: 8.0000 - val_accuracy: 0.9709 - val_precision: 0.0572 - val_recall: 0.9091 - val_auc: 0.9603 - val_prc: 0.7473
Epoch 15/1000
20/20 [==============================] - 1s 38ms/step - loss: 0.2720 - tp: 18148.0000 - fp: 2078.0000 - tn: 18372.0000 - fn: 2362.0000 - accuracy: 0.8916 - precision: 0.8973 - recall: 0.8848 - auc: 0.9523 - prc: 0.9647 - val_loss: 0.2599 - val_tp: 80.0000 - val_fp: 1201.0000 - val_tn: 44280.0000 - val_fn: 8.0000 - val_accuracy: 0.9735 - val_precision: 0.0625 - val_recall: 0.9091 - val_auc: 0.9619 - val_prc: 0.7508
Epoch 16/1000
20/20 [==============================] - 1s 38ms/step - loss: 0.2597 - tp: 18246.0000 - fp: 1900.0000 - tn: 18545.0000 - fn: 2269.0000 - accuracy: 0.8982 - precision: 0.9057 - recall: 0.8894 - auc: 0.9568 - prc: 0.9680 - val_loss: 0.2450 - val_tp: 80.0000 - val_fp: 1102.0000 - val_tn: 44379.0000 - val_fn: 8.0000 - val_accuracy: 0.9756 - val_precision: 0.0677 - val_recall: 0.9091 - val_auc: 0.9632 - val_prc: 0.7551
Epoch 17/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.2521 - tp: 18191.0000 - fp: 1700.0000 - tn: 18784.0000 - fn: 2285.0000 - accuracy: 0.9027 - precision: 0.9145 - recall: 0.8884 - auc: 0.9592 - prc: 0.9691 - val_loss: 0.2322 - val_tp: 80.0000 - val_fp: 1042.0000 - val_tn: 44439.0000 - val_fn: 8.0000 - val_accuracy: 0.9770 - val_precision: 0.0713 - val_recall: 0.9091 - val_auc: 0.9645 - val_prc: 0.7542
Epoch 18/1000
20/20 [==============================] - 1s 36ms/step - loss: 0.2451 - tp: 18311.0000 - fp: 1624.0000 - tn: 18736.0000 - fn: 2289.0000 - accuracy: 0.9045 - precision: 0.9185 - recall: 0.8889 - auc: 0.9606 - prc: 0.9706 - val_loss: 0.2209 - val_tp: 80.0000 - val_fp: 993.0000 - val_tn: 44488.0000 - val_fn: 8.0000 - val_accuracy: 0.9780 - val_precision: 0.0746 - val_recall: 0.9091 - val_auc: 0.9654 - val_prc: 0.7565
Epoch 19/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.2398 - tp: 18368.0000 - fp: 1523.0000 - tn: 18859.0000 - fn: 2210.0000 - accuracy: 0.9089 - precision: 0.9234 - recall: 0.8926 - auc: 0.9629 - prc: 0.9719 - val_loss: 0.2107 - val_tp: 80.0000 - val_fp: 951.0000 - val_tn: 44530.0000 - val_fn: 8.0000 - val_accuracy: 0.9790 - val_precision: 0.0776 - val_recall: 0.9091 - val_auc: 0.9661 - val_prc: 0.7590
Epoch 20/1000
20/20 [==============================] - 1s 38ms/step - loss: 0.2301 - tp: 18391.0000 - fp: 1391.0000 - tn: 19017.0000 - fn: 2161.0000 - accuracy: 0.9133 - precision: 0.9297 - recall: 0.8949 - auc: 0.9655 - prc: 0.9737 - val_loss: 0.2017 - val_tp: 80.0000 - val_fp: 918.0000 - val_tn: 44563.0000 - val_fn: 8.0000 - val_accuracy: 0.9797 - val_precision: 0.0802 - val_recall: 0.9091 - val_auc: 0.9673 - val_prc: 0.7596
Epoch 21/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.2288 - tp: 18311.0000 - fp: 1344.0000 - tn: 19153.0000 - fn: 2152.0000 - accuracy: 0.9146 - precision: 0.9316 - recall: 0.8948 - auc: 0.9663 - prc: 0.9738 - val_loss: 0.1924 - val_tp: 80.0000 - val_fp: 893.0000 - val_tn: 44588.0000 - val_fn: 8.0000 - val_accuracy: 0.9802 - val_precision: 0.0822 - val_recall: 0.9091 - val_auc: 0.9684 - val_prc: 0.7604
Epoch 22/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.2178 - tp: 18505.0000 - fp: 1193.0000 - tn: 19162.0000 - fn: 2100.0000 - accuracy: 0.9196 - precision: 0.9394 - recall: 0.8981 - auc: 0.9691 - prc: 0.9764 - val_loss: 0.1836 - val_tp: 80.0000 - val_fp: 854.0000 - val_tn: 44627.0000 - val_fn: 8.0000 - val_accuracy: 0.9811 - val_precision: 0.0857 - val_recall: 0.9091 - val_auc: 0.9695 - val_prc: 0.7606
Epoch 23/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.2118 - tp: 18459.0000 - fp: 1173.0000 - tn: 19289.0000 - fn: 2039.0000 - accuracy: 0.9216 - precision: 0.9403 - recall: 0.9005 - auc: 0.9708 - prc: 0.9773 - val_loss: 0.1757 - val_tp: 81.0000 - val_fp: 848.0000 - val_tn: 44633.0000 - val_fn: 7.0000 - val_accuracy: 0.9812 - val_precision: 0.0872 - val_recall: 0.9205 - val_auc: 0.9702 - val_prc: 0.7610
Epoch 24/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.2081 - tp: 18643.0000 - fp: 1146.0000 - tn: 19209.0000 - fn: 1962.0000 - accuracy: 0.9241 - precision: 0.9421 - recall: 0.9048 - auc: 0.9725 - prc: 0.9787 - val_loss: 0.1677 - val_tp: 81.0000 - val_fp: 818.0000 - val_tn: 44663.0000 - val_fn: 7.0000 - val_accuracy: 0.9819 - val_precision: 0.0901 - val_recall: 0.9205 - val_auc: 0.9711 - val_prc: 0.7629
Epoch 25/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.2020 - tp: 18600.0000 - fp: 1069.0000 - tn: 19308.0000 - fn: 1983.0000 - accuracy: 0.9255 - precision: 0.9457 - recall: 0.9037 - auc: 0.9739 - prc: 0.9796 - val_loss: 0.1609 - val_tp: 81.0000 - val_fp: 811.0000 - val_tn: 44670.0000 - val_fn: 7.0000 - val_accuracy: 0.9820 - val_precision: 0.0908 - val_recall: 0.9205 - val_auc: 0.9717 - val_prc: 0.7633
Epoch 26/1000
20/20 [==============================] - 1s 36ms/step - loss: 0.1964 - tp: 18548.0000 - fp: 1001.0000 - tn: 19529.0000 - fn: 1882.0000 - accuracy: 0.9296 - precision: 0.9488 - recall: 0.9079 - auc: 0.9755 - prc: 0.9805 - val_loss: 0.1537 - val_tp: 81.0000 - val_fp: 779.0000 - val_tn: 44702.0000 - val_fn: 7.0000 - val_accuracy: 0.9828 - val_precision: 0.0942 - val_recall: 0.9205 - val_auc: 0.9726 - val_prc: 0.7570
Epoch 27/1000
20/20 [==============================] - 1s 38ms/step - loss: 0.1915 - tp: 18618.0000 - fp: 946.0000 - tn: 19568.0000 - fn: 1828.0000 - accuracy: 0.9323 - precision: 0.9516 - recall: 0.9106 - auc: 0.9766 - prc: 0.9815 - val_loss: 0.1478 - val_tp: 81.0000 - val_fp: 768.0000 - val_tn: 44713.0000 - val_fn: 7.0000 - val_accuracy: 0.9830 - val_precision: 0.0954 - val_recall: 0.9205 - val_auc: 0.9732 - val_prc: 0.7571
Epoch 28/1000
20/20 [==============================] - 1s 36ms/step - loss: 0.1903 - tp: 18731.0000 - fp: 920.0000 - tn: 19487.0000 - fn: 1822.0000 - accuracy: 0.9331 - precision: 0.9532 - recall: 0.9114 - auc: 0.9766 - prc: 0.9815 - val_loss: 0.1422 - val_tp: 81.0000 - val_fp: 763.0000 - val_tn: 44718.0000 - val_fn: 7.0000 - val_accuracy: 0.9831 - val_precision: 0.0960 - val_recall: 0.9205 - val_auc: 0.9738 - val_prc: 0.7571
Epoch 29/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.1868 - tp: 18519.0000 - fp: 929.0000 - tn: 19714.0000 - fn: 1798.0000 - accuracy: 0.9334 - precision: 0.9522 - recall: 0.9115 - auc: 0.9776 - prc: 0.9819 - val_loss: 0.1369 - val_tp: 81.0000 - val_fp: 745.0000 - val_tn: 44736.0000 - val_fn: 7.0000 - val_accuracy: 0.9835 - val_precision: 0.0981 - val_recall: 0.9205 - val_auc: 0.9741 - val_prc: 0.7573
Epoch 30/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.1806 - tp: 18708.0000 - fp: 877.0000 - tn: 19617.0000 - fn: 1758.0000 - accuracy: 0.9357 - precision: 0.9552 - recall: 0.9141 - auc: 0.9791 - prc: 0.9832 - val_loss: 0.1324 - val_tp: 80.0000 - val_fp: 743.0000 - val_tn: 44738.0000 - val_fn: 8.0000 - val_accuracy: 0.9835 - val_precision: 0.0972 - val_recall: 0.9091 - val_auc: 0.9742 - val_prc: 0.7571
Epoch 31/1000
20/20 [==============================] - 1s 36ms/step - loss: 0.1742 - tp: 18474.0000 - fp: 800.0000 - tn: 19976.0000 - fn: 1710.0000 - accuracy: 0.9387 - precision: 0.9585 - recall: 0.9153 - auc: 0.9807 - prc: 0.9841 - val_loss: 0.1276 - val_tp: 80.0000 - val_fp: 723.0000 - val_tn: 44758.0000 - val_fn: 8.0000 - val_accuracy: 0.9840 - val_precision: 0.0996 - val_recall: 0.9091 - val_auc: 0.9746 - val_prc: 0.7573
Epoch 32/1000
20/20 [==============================] - 1s 36ms/step - loss: 0.1741 - tp: 18801.0000 - fp: 799.0000 - tn: 19630.0000 - fn: 1730.0000 - accuracy: 0.9383 - precision: 0.9592 - recall: 0.9157 - auc: 0.9805 - prc: 0.9846 - val_loss: 0.1229 - val_tp: 80.0000 - val_fp: 702.0000 - val_tn: 44779.0000 - val_fn: 8.0000 - val_accuracy: 0.9844 - val_precision: 0.1023 - val_recall: 0.9091 - val_auc: 0.9749 - val_prc: 0.7573
Epoch 33/1000
20/20 [==============================] - 1s 37ms/step - loss: 0.1712 - tp: 18747.0000 - fp: 761.0000 - tn: 19759.0000 - fn: 1693.0000 - accuracy: 0.9401 - precision: 0.9610 - recall: 0.9172 - auc: 0.9812 - prc: 0.9847 - val_loss: 0.1195 - val_tp: 80.0000 - val_fp: 700.0000 - val_tn: 44781.0000 - val_fn: 8.0000 - val_accuracy: 0.9845 - val_precision: 0.1026 - val_recall: 0.9091 - val_auc: 0.9755 - val_prc: 0.7570
Epoch 34/1000
20/20 [==============================] - 1s 39ms/step - loss: 0.1696 - tp: 18751.0000 - fp: 757.0000 - tn: 19694.0000 - fn: 1758.0000 - accuracy: 0.9386 - precision: 0.9612 - recall: 0.9143 - auc: 0.9814 - prc: 0.9849 - val_loss: 0.1164 - val_tp: 80.0000 - val_fp: 703.0000 - val_tn: 44778.0000 - val_fn: 8.0000 - val_accuracy: 0.9844 - val_precision: 0.1022 - val_recall: 0.9091 - val_auc: 0.9757 - val_prc: 0.7572
Epoch 35/1000
20/20 [==============================] - ETA: 0s - loss: 0.1651 - tp: 18897.0000 - fp: 768.0000 - tn: 19603.0000 - fn: 1692.0000 - accuracy: 0.9399 - precision: 0.9609 - recall: 0.9178 - auc: 0.9825 - prc: 0.9857Restoring model weights from the end of the best epoch: 25.
20/20 [==============================] - 1s 39ms/step - loss: 0.1651 - tp: 18897.0000 - fp: 768.0000 - tn: 19603.0000 - fn: 1692.0000 - accuracy: 0.9399 - precision: 0.9609 - recall: 0.9178 - auc: 0.9825 - prc: 0.9857 - val_loss: 0.1134 - val_tp: 79.0000 - val_fp: 712.0000 - val_tn: 44769.0000 - val_fn: 9.0000 - val_accuracy: 0.9842 - val_precision: 0.0999 - val_recall: 0.8977 - val_auc: 0.9757 - val_prc: 0.7614
Epoch 35: early stopping

Re-check training history

plot_metrics(resampled_history)

[figure: training history for the re-trained resampled model]

Evaluate metrics

train_predictions_resampled = resampled_model.predict(train_features, batch_size=BATCH_SIZE)
test_predictions_resampled = resampled_model.predict(test_features, batch_size=BATCH_SIZE)
resampled_results = resampled_model.evaluate(test_features, test_labels,
                                             batch_size=BATCH_SIZE, verbose=0)
for name, value in zip(resampled_model.metrics_names, resampled_results):
  print(name, ': ', value)
print()

plot_cm(test_labels, test_predictions_resampled)
loss :  0.16031159460544586
tp :  93.0
fp :  1035.0
tn :  55823.0
fn :  11.0
accuracy :  0.9816368818283081
precision :  0.08244680613279343
recall :  0.8942307829856873
auc :  0.9773780703544617
prc :  0.7749922275543213

Legitimate Transactions Detected (True Negatives):  55823
Legitimate Transactions Incorrectly Detected (False Positives):  1035
Fraudulent Transactions Missed (False Negatives):  11
Fraudulent Transactions Detected (True Positives):  93
Total Fraudulent Transactions:  104

[figure: confusion matrix for the resampled model]

Plot the ROC

plot_roc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_roc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')

plot_roc("Train Weighted", train_labels, train_predictions_weighted, color=colors[1])
plot_roc("Test Weighted", test_labels, test_predictions_weighted, color=colors[1], linestyle='--')

plot_roc("Train Resampled", train_labels, train_predictions_resampled, color=colors[2])
plot_roc("Test Resampled", test_labels, test_predictions_resampled, color=colors[2], linestyle='--')
plt.legend(loc='lower right');

[figure: ROC curves, baseline vs. weighted vs. resampled]

Plot the AUPRC

plot_prc("Train Baseline", train_labels, train_predictions_baseline, color=colors[0])
plot_prc("Test Baseline", test_labels, test_predictions_baseline, color=colors[0], linestyle='--')

plot_prc("Train Weighted", train_labels, train_predictions_weighted, color=colors[1])
plot_prc("Test Weighted", test_labels, test_predictions_weighted, color=colors[1], linestyle='--')

plot_prc("Train Resampled", train_labels, train_predictions_resampled, color=colors[2])
plot_prc("Test Resampled", test_labels, test_predictions_resampled, color=colors[2], linestyle='--')
plt.legend(loc='lower right');

[figure: precision-recall curves, baseline vs. weighted vs. resampled]

Applying this tutorial to your problem

Imbalanced data classification is an inherently difficult task since there are so few samples to learn from. You should always start with the data first and do your best to collect as many samples as possible, and give substantial thought to what features may be relevant so the model can get the most out of your minority class. At some point, your model may struggle to improve and yield the results you want, so it is important to keep in mind the context of your problem and the trade-offs between different types of errors.