
TFX Keras Component Tutorial

A Component-by-Component Introduction to TensorFlow Extended (TFX)

This Colab-based tutorial will interactively walk through each built-in component of TensorFlow Extended (TFX).

It covers every step in an end-to-end machine learning pipeline, from data ingestion to pushing a model to serving.

When you're done, the contents of this notebook can be automatically exported as TFX pipeline source code, which you can orchestrate with Apache Airflow and Apache Beam.

Background

This notebook demonstrates how to use TFX in a Jupyter/Colab environment. Here, we walk through the Chicago Taxi example in an interactive notebook.

Working in an interactive notebook is a useful way to become familiar with the structure of a TFX pipeline. It's also useful as a lightweight development environment when developing your own pipelines, but you should be aware that there are differences in the way interactive notebooks are orchestrated, and how they access metadata artifacts.

Orchestration

In a production deployment of TFX, you will use an orchestrator such as Apache Airflow, Kubeflow Pipelines, or Apache Beam to orchestrate a pre-defined pipeline graph of TFX components. In an interactive notebook, the notebook itself is the orchestrator, running each TFX component as you execute the notebook cells.
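
As a hedged illustration (this notebook never runs it), a production deployment would assemble the same components into a pipeline object and hand it to a runner. The pipeline name, root path, metadata path, and component list below are placeholders:

from tfx.orchestration import metadata
from tfx.orchestration import pipeline
from tfx.orchestration.beam.beam_dag_runner import BeamDagRunner

# All values below are placeholders; in this notebook the InteractiveContext
# plays the orchestrator's role instead.
beam_pipeline = pipeline.Pipeline(
    pipeline_name='taxi_pipeline',             # hypothetical name
    pipeline_root='/tmp/pipeline_root',        # hypothetical root directory
    components=[example_gen, statistics_gen],  # components defined later in this notebook
    metadata_connection_config=metadata.sqlite_metadata_connection_config(
        '/tmp/metadata.sqlite'),               # hypothetical metadata path
    enable_cache=True)
BeamDagRunner().run(beam_pipeline)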

Metadata

In a production deployment of TFX, you will access metadata through the ML Metadata (MLMD) API. MLMD stores metadata properties in a database such as MySQL or SQLite, and stores the metadata payloads in a persistent store such as your filesystem. In an interactive notebook, both properties and payloads are stored in an ephemeral SQLite database in the /tmp directory on the Jupyter notebook or Colab server.
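
As a hedged sketch (not part of the original flow), once the InteractiveContext is created later in this notebook, the MLMD store behind it can be queried directly; we assume here that the context exposes its connection config as `metadata_connection_config`:

from ml_metadata.metadata_store import metadata_store

# `context` is the InteractiveContext created further down in this notebook.
store = metadata_store.MetadataStore(context.metadata_connection_config)
for artifact_type in store.get_artifact_types():
  print(artifact_type.name)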

Setup

First, we install and import the necessary packages, set up paths, and download data.

Upgrade Pip

To avoid upgrading Pip in a system when running locally, check to make sure that we're running in Colab. Local systems can of course be upgraded separately.

try:
  import colab
  !pip install --upgrade pip
except:
  pass

Install TFX

# TODO(b/178712706): Stop using legacy resolver after PIP issue is resolved.
pip install -q -U --use-deprecated=legacy-resolver tfx

Did you restart the runtime?

If you are using Google Colab, the first time that you run the cell above, you must restart the runtime (Runtime > Restart runtime ...). This is because of the way that Colab loads packages.

Import packages

We import the necessary packages, including the standard TFX component classes.

import os
import pprint
import tempfile
import urllib

import absl
import tensorflow as tf
import tensorflow_model_analysis as tfma
tf.get_logger().propagate = False
pp = pprint.PrettyPrinter()

import tfx
from tfx.components import CsvExampleGen
from tfx.components import Evaluator
from tfx.components import ExampleValidator
from tfx.components import Pusher
from tfx.components import SchemaGen
from tfx.components import StatisticsGen
from tfx.components import Trainer
from tfx.components import Transform
from tfx.components.base import executor_spec
from tfx.components.trainer.executor import GenericExecutor
from tfx.dsl.components.common import resolver
from tfx.dsl.experimental import latest_blessed_model_resolver
from tfx.orchestration import metadata
from tfx.orchestration import pipeline
from tfx.orchestration.experimental.interactive.interactive_context import InteractiveContext
from tfx.proto import pusher_pb2
from tfx.proto import trainer_pb2
from tfx.types import Channel
from tfx.types.standard_artifacts import Model
from tfx.types.standard_artifacts import ModelBlessing


%load_ext tfx.orchestration.experimental.interactive.notebook_extensions.skip
WARNING:absl:RuntimeParameter is only supported on Cloud-based DAG runner currently.

Let's check the library versions.

print('TensorFlow version: {}'.format(tf.__version__))
print('TFX version: {}'.format(tfx.__version__))
TensorFlow version: 2.4.1
TFX version: 0.29.0

Set up pipeline paths

# This is the root directory for your TFX pip package installation.
_tfx_root = tfx.__path__[0]

# This is the directory containing the TFX Chicago Taxi Pipeline example.
_taxi_root = os.path.join(_tfx_root, 'examples/chicago_taxi_pipeline')

# This is the path where your model will be pushed for serving.
_serving_model_dir = os.path.join(
    tempfile.mkdtemp(), 'serving_model/taxi_simple')

# Set up logging.
absl.logging.set_verbosity(absl.logging.INFO)

Download example data

We download the example dataset for use in our TFX pipeline.

The dataset we're using is the Taxi Trips dataset released by the City of Chicago. The columns in this dataset are:

pickup_community_area    fare                      trip_start_month
trip_start_hour          trip_start_day            trip_start_timestamp
pickup_latitude          pickup_longitude          dropoff_latitude
dropoff_longitude        trip_miles                pickup_census_tract
dropoff_census_tract     payment_type              company
trip_seconds             dropoff_community_area    tips

With this dataset, we will build a model that predicts the tips of a trip.

_data_root = tempfile.mkdtemp(prefix='tfx-data')
DATA_PATH = 'https://raw.githubusercontent.com/tensorflow/tfx/master/tfx/examples/chicago_taxi_pipeline/data/simple/data.csv'
_data_filepath = os.path.join(_data_root, "data.csv")
urllib.request.urlretrieve(DATA_PATH, _data_filepath)
('/tmp/tfx-data6yu2nd7c/data.csv', <http.client.HTTPMessage at 0x7fd47d1b71d0>)

Take a quick look at the CSV file.

!head {_data_filepath}
pickup_community_area,fare,trip_start_month,trip_start_hour,trip_start_day,trip_start_timestamp,pickup_latitude,pickup_longitude,dropoff_latitude,dropoff_longitude,trip_miles,pickup_census_tract,dropoff_census_tract,payment_type,company,trip_seconds,dropoff_community_area,tips
,12.45,5,19,6,1400269500,,,,,0.0,,,Credit Card,Chicago Elite Cab Corp. (Chicago Carriag,0,,0.0
,0,3,19,5,1362683700,,,,,0,,,Unknown,Chicago Elite Cab Corp.,300,,0
60,27.05,10,2,3,1380593700,41.836150155,-87.648787952,,,12.6,,,Cash,Taxi Affiliation Services,1380,,0.0
10,5.85,10,1,2,1382319000,41.985015101,-87.804532006,,,0.0,,,Cash,Taxi Affiliation Services,180,,0.0
14,16.65,5,7,5,1369897200,41.968069,-87.721559063,,,0.0,,,Cash,Dispatch Taxi Affiliation,1080,,0.0
13,16.45,11,12,3,1446554700,41.983636307,-87.723583185,,,6.9,,,Cash,,780,,0.0
16,32.05,12,1,1,1417916700,41.953582125,-87.72345239,,,15.4,,,Cash,,1200,,0.0
30,38.45,10,10,5,1444301100,41.839086906,-87.714003807,,,14.6,,,Cash,,2580,,0.0
11,14.65,1,1,3,1358213400,41.978829526,-87.771166703,,,5.81,,,Cash,,1080,,0.0

Disclaimer: This site provides applications using data that has been modified for use from its original source, www.cityofchicago.org, the official website of the City of Chicago. The City of Chicago makes no claims as to the content, accuracy, timeliness, or completeness of any of the data provided at this site. The data provided at this site is subject to change at any time. It is understood that the data provided at this site is being used at one's own risk.

Create the InteractiveContext

Last, we create an InteractiveContext, which will allow us to run TFX components interactively in this notebook.

# Here, we create an InteractiveContext using default parameters. This will
# use a temporary directory with an ephemeral ML Metadata database instance.
# To use your own pipeline root or database, the optional properties
# `pipeline_root` and `metadata_connection_config` may be passed to
# InteractiveContext. Calls to InteractiveContext are no-ops outside of the
# notebook.
context = InteractiveContext()
WARNING:absl:InteractiveContext pipeline_root argument not provided: using temporary directory /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2 as root for pipeline outputs.
WARNING:absl:InteractiveContext metadata_connection_config not provided: using SQLite ML Metadata database at /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/metadata.sqlite.

Run TFX components interactively

In the cells that follow, we create TFX components one-by-one, run them, and visualize their output artifacts.

ExampleGen

The ExampleGen component is usually at the start of a TFX pipeline. It will:

  1. Split data into training and evaluation sets (by default, 2/3 training + 1/3 evaluation)
  2. Convert data into the tf.Example format
  3. Copy data into the _tfx_root directory for other components to access

ExampleGen takes as input the path to your data source. In our case, this is the _data_root path that contains the downloaded CSV.
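
As a hedged aside (this tutorial keeps the default split), the ratio can be overridden through ExampleGen's output_config; the 3:1 split and the variable name custom_example_gen below are illustrative:

from tfx.proto import example_gen_pb2

# Three hash buckets for train and one for eval, reproducing a 3:1 split.
output_config = example_gen_pb2.Output(
    split_config=example_gen_pb2.SplitConfig(splits=[
        example_gen_pb2.SplitConfig.Split(name='train', hash_buckets=3),
        example_gen_pb2.SplitConfig.Split(name='eval', hash_buckets=1),
    ]))
custom_example_gen = CsvExampleGen(
    input_base=_data_root, output_config=output_config)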

example_gen = CsvExampleGen(input_base=_data_root)
context.run(example_gen)
INFO:absl:Running driver for CsvExampleGen
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:select span and version = (0, None)
INFO:absl:latest span and version = (0, None)
INFO:absl:Running executor for CsvExampleGen
INFO:absl:Generating examples.
WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
INFO:absl:Processing input csv data /tmp/tfx-data6yu2nd7c/* to TFExample.
WARNING:apache_beam.io.tfrecordio:Couldn't find python-snappy so the implementation of _TFRecordUtil._masked_crc32c is not as fast as it could be.
INFO:absl:Examples generated.
INFO:absl:Running publisher for CsvExampleGen
INFO:absl:MetadataStore with DB connection initialized

Let's examine the output artifacts of ExampleGen. This component produces two artifacts, training examples and evaluation examples:

artifact = example_gen.outputs['examples'].get()[0]
print(artifact.split_names, artifact.uri)
["train", "eval"] /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/CsvExampleGen/examples/1

We can also take a look at the first three training examples:

# Get the URI of the output artifact representing the training examples, which is a directory
train_uri = os.path.join(example_gen.outputs['examples'].get()[0].uri, 'Split-train')

# Get the list of files in this directory (all compressed TFRecord files)
tfrecord_filenames = [os.path.join(train_uri, name)
                      for name in os.listdir(train_uri)]

# Create a `TFRecordDataset` to read these files
dataset = tf.data.TFRecordDataset(tfrecord_filenames, compression_type="GZIP")

# Iterate over the first 3 records and decode them.
for tfrecord in dataset.take(3):
  serialized_example = tfrecord.numpy()
  example = tf.train.Example()
  example.ParseFromString(serialized_example)
  pp.pprint(example)
features {
  feature {
    key: "company"
    value {
      bytes_list {
        value: "Chicago Elite Cab Corp. (Chicago Carriag"
      }
    }
  }
  feature {
    key: "dropoff_census_tract"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "dropoff_community_area"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "dropoff_latitude"
    value {
      float_list {
      }
    }
  }
  feature {
    key: "dropoff_longitude"
    value {
      float_list {
      }
    }
  }
  feature {
    key: "fare"
    value {
      float_list {
        value: 12.449999809265137
      }
    }
  }
  feature {
    key: "payment_type"
    value {
      bytes_list {
        value: "Credit Card"
      }
    }
  }
  feature {
    key: "pickup_census_tract"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "pickup_community_area"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "pickup_latitude"
    value {
      float_list {
      }
    }
  }
  feature {
    key: "pickup_longitude"
    value {
      float_list {
      }
    }
  }
  feature {
    key: "tips"
    value {
      float_list {
        value: 0.0
      }
    }
  }
  feature {
    key: "trip_miles"
    value {
      float_list {
        value: 0.0
      }
    }
  }
  feature {
    key: "trip_seconds"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "trip_start_day"
    value {
      int64_list {
        value: 6
      }
    }
  }
  feature {
    key: "trip_start_hour"
    value {
      int64_list {
        value: 19
      }
    }
  }
  feature {
    key: "trip_start_month"
    value {
      int64_list {
        value: 5
      }
    }
  }
  feature {
    key: "trip_start_timestamp"
    value {
      int64_list {
        value: 1400269500
      }
    }
  }
}

features {
  feature {
    key: "company"
    value {
      bytes_list {
        value: "Taxi Affiliation Services"
      }
    }
  }
  feature {
    key: "dropoff_census_tract"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "dropoff_community_area"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "dropoff_latitude"
    value {
      float_list {
      }
    }
  }
  feature {
    key: "dropoff_longitude"
    value {
      float_list {
      }
    }
  }
  feature {
    key: "fare"
    value {
      float_list {
        value: 27.049999237060547
      }
    }
  }
  feature {
    key: "payment_type"
    value {
      bytes_list {
        value: "Cash"
      }
    }
  }
  feature {
    key: "pickup_census_tract"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "pickup_community_area"
    value {
      int64_list {
        value: 60
      }
    }
  }
  feature {
    key: "pickup_latitude"
    value {
      float_list {
        value: 41.836151123046875
      }
    }
  }
  feature {
    key: "pickup_longitude"
    value {
      float_list {
        value: -87.64878845214844
      }
    }
  }
  feature {
    key: "tips"
    value {
      float_list {
        value: 0.0
      }
    }
  }
  feature {
    key: "trip_miles"
    value {
      float_list {
        value: 12.600000381469727
      }
    }
  }
  feature {
    key: "trip_seconds"
    value {
      int64_list {
        value: 1380
      }
    }
  }
  feature {
    key: "trip_start_day"
    value {
      int64_list {
        value: 3
      }
    }
  }
  feature {
    key: "trip_start_hour"
    value {
      int64_list {
        value: 2
      }
    }
  }
  feature {
    key: "trip_start_month"
    value {
      int64_list {
        value: 10
      }
    }
  }
  feature {
    key: "trip_start_timestamp"
    value {
      int64_list {
        value: 1380593700
      }
    }
  }
}

features {
  feature {
    key: "company"
    value {
      bytes_list {
      }
    }
  }
  feature {
    key: "dropoff_census_tract"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "dropoff_community_area"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "dropoff_latitude"
    value {
      float_list {
      }
    }
  }
  feature {
    key: "dropoff_longitude"
    value {
      float_list {
      }
    }
  }
  feature {
    key: "fare"
    value {
      float_list {
        value: 16.450000762939453
      }
    }
  }
  feature {
    key: "payment_type"
    value {
      bytes_list {
        value: "Cash"
      }
    }
  }
  feature {
    key: "pickup_census_tract"
    value {
      int64_list {
      }
    }
  }
  feature {
    key: "pickup_community_area"
    value {
      int64_list {
        value: 13
      }
    }
  }
  feature {
    key: "pickup_latitude"
    value {
      float_list {
        value: 41.98363494873047
      }
    }
  }
  feature {
    key: "pickup_longitude"
    value {
      float_list {
        value: -87.72357940673828
      }
    }
  }
  feature {
    key: "tips"
    value {
      float_list {
        value: 0.0
      }
    }
  }
  feature {
    key: "trip_miles"
    value {
      float_list {
        value: 6.900000095367432
      }
    }
  }
  feature {
    key: "trip_seconds"
    value {
      int64_list {
        value: 780
      }
    }
  }
  feature {
    key: "trip_start_day"
    value {
      int64_list {
        value: 3
      }
    }
  }
  feature {
    key: "trip_start_hour"
    value {
      int64_list {
        value: 12
      }
    }
  }
  feature {
    key: "trip_start_month"
    value {
      int64_list {
        value: 11
      }
    }
  }
  feature {
    key: "trip_start_timestamp"
    value {
      int64_list {
        value: 1446554700
      }
    }
  }
}

Now that ExampleGen has finished ingesting the data, the next step is data analysis.

StatisticsGen

The StatisticsGen component computes statistics over your dataset for data analysis, as well as for use in downstream components. It uses the TensorFlow Data Validation library.

StatisticsGen takes as input the dataset we just ingested using ExampleGen.

statistics_gen = StatisticsGen(
    examples=example_gen.outputs['examples'])
context.run(statistics_gen)
INFO:absl:Excluding no splits because exclude_splits is not set.
INFO:absl:Running driver for StatisticsGen
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running executor for StatisticsGen
INFO:absl:Generating statistics for split train.
INFO:absl:Statistics for split train written to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/StatisticsGen/statistics/2/Split-train.
INFO:absl:Generating statistics for split eval.
INFO:absl:Statistics for split eval written to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/StatisticsGen/statistics/2/Split-eval.
INFO:absl:Running publisher for StatisticsGen
INFO:absl:MetadataStore with DB connection initialized

After StatisticsGen finishes running, we can visualize the outputted statistics. Try playing with the different plots!

context.show(statistics_gen.outputs['statistics'])
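
As a hedged sketch (not in the original notebook), the same statistics can be loaded directly with TFDV; the file name inside the split directory is an assumption that has changed across TFX releases:

import tensorflow_data_validation as tfdv

# 'stats_tfrecord' is an assumed file name; adjust it to whatever file the
# Split-train directory actually contains in your TFX version.
stats_path = os.path.join(
    statistics_gen.outputs['statistics'].get()[0].uri,
    'Split-train', 'stats_tfrecord')
train_stats = tfdv.load_statistics(stats_path)
tfdv.visualize_statistics(train_stats)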

SchemaGen

The SchemaGen component generates a schema based on your data statistics. (A schema defines the expected bounds, types, and properties of the features in your dataset.) It also uses the TensorFlow Data Validation library.

SchemaGen will take as input the statistics that we generated with StatisticsGen, looking at the training split by default.

schema_gen = SchemaGen(
    statistics=statistics_gen.outputs['statistics'],
    infer_feature_shape=False)
context.run(schema_gen)
INFO:absl:Excluding no splits because exclude_splits is not set.
INFO:absl:Running driver for SchemaGen
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running executor for SchemaGen
INFO:absl:Processing schema from statistics for split train.
INFO:absl:Processing schema from statistics for split eval.
INFO:absl:Schema written to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/SchemaGen/schema/3/schema.pbtxt.
INFO:absl:Running publisher for SchemaGen
INFO:absl:MetadataStore with DB connection initialized

After SchemaGen finishes running, we can visualize the generated schema as a table.

context.show(schema_gen.outputs['schema'])

Each feature in your dataset shows up as a row in the schema table, alongside its properties. The schema also captures all the values that a categorical feature takes on, denoted as its domain.

To learn more about schemas, see the SchemaGen documentation.
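
As a hedged sketch (not in the original notebook), the schema artifact is a plain pbtxt file that can be loaded and inspected with TFDV; the file name 'schema.pbtxt' matches the path reported in the SchemaGen logs above:

import tensorflow_data_validation as tfdv

schema_path = os.path.join(
    schema_gen.outputs['schema'].get()[0].uri, 'schema.pbtxt')
schema = tfdv.load_schema_text(schema_path)
tfdv.display_schema(schema)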

ExampleValidator

The ExampleValidator component detects anomalies in your data, based on the expectations defined by the schema. It also uses the TensorFlow Data Validation library.

ExampleValidator will take as input the statistics from StatisticsGen, and the schema from SchemaGen.

example_validator = ExampleValidator(
    statistics=statistics_gen.outputs['statistics'],
    schema=schema_gen.outputs['schema'])
context.run(example_validator)
INFO:absl:Excluding no splits because exclude_splits is not set.
INFO:absl:Running driver for ExampleValidator
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running executor for ExampleValidator
INFO:absl:Validating schema against the computed statistics for split train.
INFO:absl:Validation complete for split train. Anomalies written to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/ExampleValidator/anomalies/4/Split-train.
INFO:absl:Validating schema against the computed statistics for split eval.
INFO:absl:Validation complete for split eval. Anomalies written to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/ExampleValidator/anomalies/4/Split-eval.
INFO:absl:Running publisher for ExampleValidator
INFO:absl:MetadataStore with DB connection initialized

After ExampleValidator finishes running, we can visualize the anomalies as a table.

context.show(example_validator.outputs['anomalies'])
/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow_data_validation/utils/display_util.py:188: FutureWarning: Passing a negative integer is deprecated in version 1.0 and will not be supported in future version. Instead, use None to not limit the column width.
  pd.set_option('max_colwidth', -1)

In the anomalies table, we can see that there are no anomalies. This is what we'd expect, since this is the first dataset that we've analyzed and the schema is tailored to it. You should review this schema: anything unexpected means an anomaly in the data. Once reviewed, the schema can be used to guard future data, and anomalies produced here can be used to debug model performance, understand how your data evolves over time, and identify data errors.
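
As a hedged sketch (not in the original notebook), the anomalies proto can also be read directly from the artifact directory; the file name 'SchemaDiff.pb' and its text encoding are assumptions that may vary by TFX version (the ExampleValidator logs above report only the split directory):

from google.protobuf import text_format
from tensorflow_metadata.proto.v0 import anomalies_pb2

anomalies_path = os.path.join(
    example_validator.outputs['anomalies'].get()[0].uri,
    'Split-eval', 'SchemaDiff.pb')  # assumed file layout
anomalies = anomalies_pb2.Anomalies()
with open(anomalies_path) as f:
  text_format.Parse(f.read(), anomalies)
print(anomalies.anomaly_info or 'No anomalies found')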

Transform

The Transform component performs feature engineering for both training and serving. It uses the TensorFlow Transform library.

Transform will take as input the data from ExampleGen, the schema from SchemaGen, as well as a module that contains user-defined Transform code.

Let's see an example of user-defined Transform code below (for an introduction to the TensorFlow Transform APIs, see the tutorial). First, we define some constants for feature engineering:

_taxi_constants_module_file = 'taxi_constants.py'
%%writefile {_taxi_constants_module_file}

# Categorical features are assumed to each have a maximum value in the dataset.
MAX_CATEGORICAL_FEATURE_VALUES = [24, 31, 12]

CATEGORICAL_FEATURE_KEYS = [
    'trip_start_hour', 'trip_start_day', 'trip_start_month',
    'pickup_census_tract', 'dropoff_census_tract', 'pickup_community_area',
    'dropoff_community_area'
]

DENSE_FLOAT_FEATURE_KEYS = ['trip_miles', 'fare', 'trip_seconds']

# Number of buckets used by tf.transform for encoding each feature.
FEATURE_BUCKET_COUNT = 10

BUCKET_FEATURE_KEYS = [
    'pickup_latitude', 'pickup_longitude', 'dropoff_latitude',
    'dropoff_longitude'
]

# Number of vocabulary terms used for encoding VOCAB_FEATURES by tf.transform
VOCAB_SIZE = 1000

# Count of out-of-vocab buckets in which unrecognized VOCAB_FEATURES are hashed.
OOV_SIZE = 10

VOCAB_FEATURE_KEYS = [
    'payment_type',
    'company',
]

# Keys
LABEL_KEY = 'tips'
FARE_KEY = 'fare'

def transformed_name(key):
  return key + '_xf'
Writing taxi_constants.py

Next, we write a preprocessing_fn that takes in raw data as input, and returns transformed features that our model can train on:

_taxi_transform_module_file = 'taxi_transform.py'
%%writefile {_taxi_transform_module_file}

import tensorflow as tf
import tensorflow_transform as tft

import taxi_constants

_DENSE_FLOAT_FEATURE_KEYS = taxi_constants.DENSE_FLOAT_FEATURE_KEYS
_VOCAB_FEATURE_KEYS = taxi_constants.VOCAB_FEATURE_KEYS
_VOCAB_SIZE = taxi_constants.VOCAB_SIZE
_OOV_SIZE = taxi_constants.OOV_SIZE
_FEATURE_BUCKET_COUNT = taxi_constants.FEATURE_BUCKET_COUNT
_BUCKET_FEATURE_KEYS = taxi_constants.BUCKET_FEATURE_KEYS
_CATEGORICAL_FEATURE_KEYS = taxi_constants.CATEGORICAL_FEATURE_KEYS
_FARE_KEY = taxi_constants.FARE_KEY
_LABEL_KEY = taxi_constants.LABEL_KEY
_transformed_name = taxi_constants.transformed_name


def preprocessing_fn(inputs):
  """tf.transform's callback function for preprocessing inputs.
  Args:
    inputs: map from feature keys to raw not-yet-transformed features.
  Returns:
    Map from string feature key to transformed feature operations.
  """
  outputs = {}
  for key in _DENSE_FLOAT_FEATURE_KEYS:
    # Preserve this feature as a dense float, setting nan's to the mean.
    outputs[_transformed_name(key)] = tft.scale_to_z_score(
        _fill_in_missing(inputs[key]))

  for key in _VOCAB_FEATURE_KEYS:
    # Build a vocabulary for this feature.
    outputs[_transformed_name(key)] = tft.compute_and_apply_vocabulary(
        _fill_in_missing(inputs[key]),
        top_k=_VOCAB_SIZE,
        num_oov_buckets=_OOV_SIZE)

  for key in _BUCKET_FEATURE_KEYS:
    outputs[_transformed_name(key)] = tft.bucketize(
        _fill_in_missing(inputs[key]), _FEATURE_BUCKET_COUNT)

  for key in _CATEGORICAL_FEATURE_KEYS:
    outputs[_transformed_name(key)] = _fill_in_missing(inputs[key])

  # Was this passenger a big tipper?
  taxi_fare = _fill_in_missing(inputs[_FARE_KEY])
  tips = _fill_in_missing(inputs[_LABEL_KEY])
  outputs[_transformed_name(_LABEL_KEY)] = tf.where(
      tf.math.is_nan(taxi_fare),
      tf.cast(tf.zeros_like(taxi_fare), tf.int64),
      # Test if the tip was > 20% of the fare.
      tf.cast(
          tf.greater(tips, tf.multiply(taxi_fare, tf.constant(0.2))), tf.int64))

  return outputs


def _fill_in_missing(x):
  """Replace missing values in a SparseTensor.
  Fills in missing values of `x` with '' or 0, and converts to a dense tensor.
  Args:
    x: A `SparseTensor` of rank 2.  Its dense shape should have size at most 1
      in the second dimension.
  Returns:
    A rank 1 tensor where missing values of `x` have been filled in.
  """
  if not isinstance(x, tf.sparse.SparseTensor):
    return x

  default_value = '' if x.dtype == tf.string else 0
  return tf.squeeze(
      tf.sparse.to_dense(
          tf.SparseTensor(x.indices, x.values, [x.dense_shape[0], 1]),
          default_value),
      axis=1)
Writing taxi_transform.py

Now, we pass in this feature engineering code to the Transform component and run it to transform your data.

transform = Transform(
    examples=example_gen.outputs['examples'],
    schema=schema_gen.outputs['schema'],
    module_file=os.path.abspath(_taxi_transform_module_file))
context.run(transform)
INFO:absl:Running driver for Transform
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running executor for Transform
INFO:absl:Analyze the 'train' split and transform all splits when splits_config is not set.
WARNING:absl:The default value of `force_tf_compat_v1` will change in a future release from `True` to `False`. Since this pipeline has TF 2 behaviors enabled, Transform will use native TF 2 at that point. You can test this behavior now by passing `force_tf_compat_v1=False` or disable it by explicitly setting `force_tf_compat_v1=True` in the Transform component.
INFO:absl:Loading source_path /tmpfs/src/temp/docs/tutorials/tfx/taxi_transform.py as name user_module_0 because it has not been loaded before.
INFO:absl:/tmpfs/src/temp/docs/tutorials/tfx/taxi_transform.py is already loaded, reloading
INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor.
WARNING:tensorflow:From /tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow_transform/tf_utils.py:266: Tensor.experimental_ref (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use ref() instead.
INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor.
WARNING:root:This output type hint will be ignored and not used for type-checking purposes. Typically, output type hints for a PTransform are single (or nested) types wrapped by a PCollection, PDone, or None. Got: Tuple[Dict[str, Union[NoneType, _Dataset]], Union[Dict[str, Dict[str, PCollection]], NoneType]] instead.
WARNING:root:This output type hint will be ignored and not used for type-checking purposes. Typically, output type hints for a PTransform are single (or nested) types wrapped by a PCollection, PDone, or None. Got: Tuple[Dict[str, Union[NoneType, _Dataset]], Union[Dict[str, Dict[str, PCollection]], NoneType]] instead.
WARNING:tensorflow:Tensorflow version (2.4.1) found. Note that Tensorflow Transform support for TF 2.0 is currently in beta, and features such as tf.function may not work as intended. 
WARNING:tensorflow:From /tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/saved_model/signature_def_utils_impl.py:201: build_tensor_info (from tensorflow.python.saved_model.utils_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info.
INFO:tensorflow:Assets added to graph.
INFO:tensorflow:No assets to write.
WARNING:tensorflow:Issue encountered when serializing tft_mapper_use.
Type is unsupported, or the types of the items don't match field type in CollectionDef. Note this is a warning and probably safe to ignore.
'Counter' object has no attribute 'name'
INFO:tensorflow:SavedModel written to: /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Transform/transform_graph/5/.temp_path/tftransform_tmp/2759bcbc74944ae8bd2eed4582b41bc1/saved_model.pb
INFO:tensorflow:Assets added to graph.
INFO:tensorflow:No assets to write.
WARNING:tensorflow:Issue encountered when serializing tft_mapper_use.
Type is unsupported, or the types of the items don't match field type in CollectionDef. Note this is a warning and probably safe to ignore.
'Counter' object has no attribute 'name'
INFO:tensorflow:SavedModel written to: /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Transform/transform_graph/5/.temp_path/tftransform_tmp/b54b7e0c740b43fabb958e7415757157/saved_model.pb
INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor.
WARNING:tensorflow:Tensorflow version (2.4.1) found. Note that Tensorflow Transform support for TF 2.0 is currently in beta, and features such as tf.function may not work as intended.
INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor.
INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor.
WARNING:tensorflow:Tensorflow version (2.4.1) found. Note that Tensorflow Transform support for TF 2.0 is currently in beta, and features such as tf.function may not work as intended.
WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'>
INFO:tensorflow:Saver not created because there are no variables in the graph to restore
INFO:tensorflow:Saver not created because there are no variables in the graph to restore
INFO:tensorflow:Assets added to graph.
INFO:tensorflow:Assets written to: /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Transform/transform_graph/5/.temp_path/tftransform_tmp/bed9b4049e9a44abaf388abb10372f79/assets
INFO:tensorflow:SavedModel written to: /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Transform/transform_graph/5/.temp_path/tftransform_tmp/bed9b4049e9a44abaf388abb10372f79/saved_model.pb
WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef"
value: "\n\013\n\tConst_2:0\022-vocab_compute_and_apply_vocabulary_vocabulary"

WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef"
value: "\n\013\n\tConst_4:0\022/vocab_compute_and_apply_vocabulary_1_vocabulary"

INFO:tensorflow:Saver not created because there are no variables in the graph to restore
WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef"
value: "\n\013\n\tConst_2:0\022-vocab_compute_and_apply_vocabulary_vocabulary"

WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef"
value: "\n\013\n\tConst_4:0\022/vocab_compute_and_apply_vocabulary_1_vocabulary"

INFO:tensorflow:Saver not created because there are no variables in the graph to restore
WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef"
value: "\n\013\n\tConst_2:0\022-vocab_compute_and_apply_vocabulary_vocabulary"

WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef"
value: "\n\013\n\tConst_4:0\022/vocab_compute_and_apply_vocabulary_1_vocabulary"

INFO:tensorflow:Saver not created because there are no variables in the graph to restore
INFO:absl:Running publisher for Transform
INFO:absl:MetadataStore with DB connection initialized

Let's examine the output artifacts of Transform. This component produces two types of outputs:

  • transform_graph is the graph that can perform the preprocessing operations (this graph will be included in the serving and evaluation models).
  • transformed_examples represents the preprocessed training and evaluation data.
transform.outputs
{'transform_graph': Channel(
    type_name: TransformGraph
    artifacts: [Artifact(artifact: id: 5
type_id: 13
uri: "/tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Transform/transform_graph/5"
custom_properties {
  key: "name"
  value {
    string_value: "transform_graph"
  }
}
custom_properties {
  key: "producer_component"
  value {
    string_value: "Transform"
  }
}
custom_properties {
  key: "state"
  value {
    string_value: "published"
  }
}
custom_properties {
  key: "tfx_version"
  value {
    string_value: "0.29.0"
  }
}
state: LIVE
, artifact_type: id: 13
name: "TransformGraph"
)]
    additional_properties: {}
    additional_custom_properties: {}
), 'transformed_examples': Channel(
    type_name: Examples
    artifacts: [Artifact(artifact: id: 6
type_id: 5
uri: "/tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Transform/transformed_examples/5"
properties {
  key: "split_names"
  value {
    string_value: "[\"train\", \"eval\"]"
  }
}
custom_properties {
  key: "name"
  value {
    string_value: "transformed_examples"
  }
}
custom_properties {
  key: "producer_component"
  value {
    string_value: "Transform"
  }
}
custom_properties {
  key: "state"
  value {
    string_value: "published"
  }
}
custom_properties {
  key: "tfx_version"
  value {
    string_value: "0.29.0"
  }
}
state: LIVE
, artifact_type: id: 5
name: "Examples"
properties {
  key: "span"
  value: INT
}
properties {
  key: "split_names"
  value: STRING
}
properties {
  key: "version"
  value: INT
}
)]
    additional_properties: {}
    additional_custom_properties: {}
), 'updated_analyzer_cache': Channel(
    type_name: TransformCache
    artifacts: [Artifact(artifact: id: 7
type_id: 14
uri: "/tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Transform/updated_analyzer_cache/5"
custom_properties {
  key: "name"
  value {
    string_value: "updated_analyzer_cache"
  }
}
custom_properties {
  key: "producer_component"
  value {
    string_value: "Transform"
  }
}
custom_properties {
  key: "state"
  value {
    string_value: "published"
  }
}
custom_properties {
  key: "tfx_version"
  value {
    string_value: "0.29.0"
  }
}
state: LIVE
, artifact_type: id: 14
name: "TransformCache"
)]
    additional_properties: {}
    additional_custom_properties: {}
)}

Take a peek at the transform_graph artifact. It points to a directory containing three subdirectories.

train_uri = transform.outputs['transform_graph'].get()[0].uri
os.listdir(train_uri)
['transform_fn', 'transformed_metadata', 'metadata']

The transformed_metadata subdirectory contains the schema of the preprocessed data. The transform_fn subdirectory contains the actual preprocessing graph. The metadata subdirectory contains the schema of the original data.
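
As a hedged sketch (not in the original notebook), the same directory can be wrapped with TensorFlow Transform's output utility to inspect the post-transform feature spec:

import tensorflow_transform as tft

tf_transform_output = tft.TFTransformOutput(
    transform.outputs['transform_graph'].get()[0].uri)
# The feature spec of the preprocessed data, read from transformed_metadata.
pp.pprint(tf_transform_output.transformed_feature_spec())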

We can also take a look at the first three transformed examples:

# Get the URI of the output artifact representing the transformed examples, which is a directory
train_uri = os.path.join(transform.outputs['transformed_examples'].get()[0].uri, 'Split-train')

# Get the list of files in this directory (all compressed TFRecord files)
tfrecord_filenames = [os.path.join(train_uri, name)
                      for name in os.listdir(train_uri)]

# Create a `TFRecordDataset` to read these files
dataset = tf.data.TFRecordDataset(tfrecord_filenames, compression_type="GZIP")

# Iterate over the first 3 records and decode them.
for tfrecord in dataset.take(3):
  serialized_example = tfrecord.numpy()
  example = tf.train.Example()
  example.ParseFromString(serialized_example)
  pp.pprint(example)
features {
  feature {
    key: "company_xf"
    value {
      int64_list {
        value: 8
      }
    }
  }
  feature {
    key: "dropoff_census_tract_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_community_area_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_latitude_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_longitude_xf"
    value {
      int64_list {
        value: 9
      }
    }
  }
  feature {
    key: "fare_xf"
    value {
      float_list {
        value: 0.061060599982738495
      }
    }
  }
  feature {
    key: "payment_type_xf"
    value {
      int64_list {
        value: 1
      }
    }
  }
  feature {
    key: "pickup_census_tract_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "pickup_community_area_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "pickup_latitude_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "pickup_longitude_xf"
    value {
      int64_list {
        value: 9
      }
    }
  }
  feature {
    key: "tips_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "trip_miles_xf"
    value {
      float_list {
        value: -0.15886740386486053
      }
    }
  }
  feature {
    key: "trip_seconds_xf"
    value {
      float_list {
        value: -0.7118487358093262
      }
    }
  }
  feature {
    key: "trip_start_day_xf"
    value {
      int64_list {
        value: 6
      }
    }
  }
  feature {
    key: "trip_start_hour_xf"
    value {
      int64_list {
        value: 19
      }
    }
  }
  feature {
    key: "trip_start_month_xf"
    value {
      int64_list {
        value: 5
      }
    }
  }
}

features {
  feature {
    key: "company_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_census_tract_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_community_area_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_latitude_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_longitude_xf"
    value {
      int64_list {
        value: 9
      }
    }
  }
  feature {
    key: "fare_xf"
    value {
      float_list {
        value: 1.2521240711212158
      }
    }
  }
  feature {
    key: "payment_type_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "pickup_census_tract_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "pickup_community_area_xf"
    value {
      int64_list {
        value: 60
      }
    }
  }
  feature {
    key: "pickup_latitude_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "pickup_longitude_xf"
    value {
      int64_list {
        value: 3
      }
    }
  }
  feature {
    key: "tips_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "trip_miles_xf"
    value {
      float_list {
        value: 0.532160758972168
      }
    }
  }
  feature {
    key: "trip_seconds_xf"
    value {
      float_list {
        value: 0.5509493350982666
      }
    }
  }
  feature {
    key: "trip_start_day_xf"
    value {
      int64_list {
        value: 3
      }
    }
  }
  feature {
    key: "trip_start_hour_xf"
    value {
      int64_list {
        value: 2
      }
    }
  }
  feature {
    key: "trip_start_month_xf"
    value {
      int64_list {
        value: 10
      }
    }
  }
}

features {
  feature {
    key: "company_xf"
    value {
      int64_list {
        value: 48
      }
    }
  }
  feature {
    key: "dropoff_census_tract_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_community_area_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_latitude_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "dropoff_longitude_xf"
    value {
      int64_list {
        value: 9
      }
    }
  }
  feature {
    key: "fare_xf"
    value {
      float_list {
        value: 0.3873794376850128
      }
    }
  }
  feature {
    key: "payment_type_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "pickup_census_tract_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "pickup_community_area_xf"
    value {
      int64_list {
        value: 13
      }
    }
  }
  feature {
    key: "pickup_latitude_xf"
    value {
      int64_list {
        value: 9
      }
    }
  }
  feature {
    key: "pickup_longitude_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "tips_xf"
    value {
      int64_list {
        value: 0
      }
    }
  }
  feature {
    key: "trip_miles_xf"
    value {
      float_list {
        value: 0.21955278515815735
      }
    }
  }
  feature {
    key: "trip_seconds_xf"
    value {
      float_list {
        value: 0.0019067145185545087
      }
    }
  }
  feature {
    key: "trip_start_day_xf"
    value {
      int64_list {
        value: 3
      }
    }
  }
  feature {
    key: "trip_start_hour_xf"
    value {
      int64_list {
        value: 12
      }
    }
  }
  feature {
    key: "trip_start_month_xf"
    value {
      int64_list {
        value: 11
      }
    }
  }
}

After the Transform component has transformed your data into features, the next step is to train a model.

Trainer

The Trainer component will train a model that you define in TensorFlow. The default Trainer supports the Estimator API; to use the Keras API instead, you need to specify the Generic Trainer by setting custom_executor_spec=executor_spec.ExecutorClassSpec(GenericExecutor) in Trainer's constructor.

Trainer takes as input the schema from SchemaGen, the transformed data and graph from Transform, training parameters, as well as a module that contains user-defined model code, as sketched below.
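
As a hedged preview of how the Trainer will be constructed once the module file below is written (the num_steps values here are illustrative):

trainer = Trainer(
    module_file=os.path.abspath(_taxi_trainer_module_file),
    # Select the generic, Keras-capable executor.
    custom_executor_spec=executor_spec.ExecutorClassSpec(GenericExecutor),
    examples=transform.outputs['transformed_examples'],
    transform_graph=transform.outputs['transform_graph'],
    schema=schema_gen.outputs['schema'],
    train_args=trainer_pb2.TrainArgs(num_steps=10000),
    eval_args=trainer_pb2.EvalArgs(num_steps=5000))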

Let's see an example of user-defined model code below (for an introduction to the TensorFlow Keras APIs, see the tutorial):

_taxi_trainer_module_file = 'taxi_trainer.py'
%%writefile {_taxi_trainer_module_file}

from typing import List, Text

import os
import absl
import datetime
import tensorflow as tf
import tensorflow_transform as tft

from tfx.components.trainer.executor import TrainerFnArgs
from tfx.components.trainer.fn_args_utils import DataAccessor
from tfx_bsl.tfxio import dataset_options

import taxi_constants

_DENSE_FLOAT_FEATURE_KEYS = taxi_constants.DENSE_FLOAT_FEATURE_KEYS
_VOCAB_FEATURE_KEYS = taxi_constants.VOCAB_FEATURE_KEYS
_VOCAB_SIZE = taxi_constants.VOCAB_SIZE
_OOV_SIZE = taxi_constants.OOV_SIZE
_FEATURE_BUCKET_COUNT = taxi_constants.FEATURE_BUCKET_COUNT
_BUCKET_FEATURE_KEYS = taxi_constants.BUCKET_FEATURE_KEYS
_CATEGORICAL_FEATURE_KEYS = taxi_constants.CATEGORICAL_FEATURE_KEYS
_MAX_CATEGORICAL_FEATURE_VALUES = taxi_constants.MAX_CATEGORICAL_FEATURE_VALUES
_LABEL_KEY = taxi_constants.LABEL_KEY
_transformed_name = taxi_constants.transformed_name


def _transformed_names(keys):
  return [_transformed_name(key) for key in keys]


def _get_serve_tf_examples_fn(model, tf_transform_output):
  """Returns a function that parses a serialized tf.Example and applies TFT."""

  model.tft_layer = tf_transform_output.transform_features_layer()

  @tf.function
  def serve_tf_examples_fn(serialized_tf_examples):
    """Returns the output to be used in the serving signature."""
    feature_spec = tf_transform_output.raw_feature_spec()
    feature_spec.pop(_LABEL_KEY)
    parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)
    transformed_features = model.tft_layer(parsed_features)
    return model(transformed_features)

  return serve_tf_examples_fn


def _input_fn(file_pattern: List[Text],
              data_accessor: DataAccessor,
              tf_transform_output: tft.TFTransformOutput,
              batch_size: int = 200) -> tf.data.Dataset:
  """Generates features and label for tuning/training.

  Args:
    file_pattern: List of paths or patterns of input tfrecord files.
    data_accessor: DataAccessor for converting input to RecordBatch.
    tf_transform_output: A TFTransformOutput.
    batch_size: representing the number of consecutive elements of returned
      dataset to combine in a single batch

  Returns:
    A dataset that contains (features, indices) tuple where features is a
      dictionary of Tensors, and indices is a single Tensor of label indices.
  """
  return data_accessor.tf_dataset_factory(
      file_pattern,
      dataset_options.TensorFlowDatasetOptions(
          batch_size=batch_size, label_key=_transformed_name(_LABEL_KEY)),
      tf_transform_output.transformed_metadata.schema)


def _build_keras_model(hidden_units: List[int] = None) -> tf.keras.Model:
  """Creates a DNN Keras model for classifying taxi data.

  Args:
    hidden_units: [int], the layer sizes of the DNN (input layer first).

  Returns:
    A keras Model.
  """
  real_valued_columns = [
      tf.feature_column.numeric_column(key, shape=())
      for key in _transformed_names(_DENSE_FLOAT_FEATURE_KEYS)
  ]
  categorical_columns = [
      tf.feature_column.categorical_column_with_identity(
          key, num_buckets=_VOCAB_SIZE + _OOV_SIZE, default_value=0)
      for key in _transformed_names(_VOCAB_FEATURE_KEYS)
  ]
  categorical_columns += [
      tf.feature_column.categorical_column_with_identity(
          key, num_buckets=_FEATURE_BUCKET_COUNT, default_value=0)
      for key in _transformed_names(_BUCKET_FEATURE_KEYS)
  ]
  categorical_columns += [
      tf.feature_column.categorical_column_with_identity(  # pylint: disable=g-complex-comprehension
          key,
          num_buckets=num_buckets,
          default_value=0) for key, num_buckets in zip(
              _transformed_names(_CATEGORICAL_FEATURE_KEYS),
              _MAX_CATEGORICAL_FEATURE_VALUES)
  ]
  indicator_column = [
      tf.feature_column.indicator_column(categorical_column)
      for categorical_column in categorical_columns
  ]

  model = _wide_and_deep_classifier(
      # TODO(b/139668410) replace with premade wide_and_deep keras model
      wide_columns=indicator_column,
      deep_columns=real_valued_columns,
      dnn_hidden_units=hidden_units or [100, 70, 50, 25])
  return model


def _wide_and_deep_classifier(wide_columns, deep_columns, dnn_hidden_units):
  """Build a simple keras wide and deep model.

  Args:
    wide_columns: Feature columns wrapped in indicator_column for wide (linear)
      part of the model.
    deep_columns: Feature columns for deep part of the model.
    dnn_hidden_units: [int], the layer sizes of the hidden DNN.

  Returns:
    A Wide and Deep Keras model
  """
  # The following values are hard coded for simplicity in this example,
  # however preferably they should be passed in as hparams.

  # Keras needs the feature definitions at compile time.
  # TODO(b/139081439): Automate generation of input layers from FeatureColumn.
  input_layers = {
      colname: tf.keras.layers.Input(name=colname, shape=(), dtype=tf.float32)
      for colname in _transformed_names(_DENSE_FLOAT_FEATURE_KEYS)
  }
  input_layers.update({
      colname: tf.keras.layers.Input(name=colname, shape=(), dtype='int32')
      for colname in _transformed_names(_VOCAB_FEATURE_KEYS)
  })
  input_layers.update({
      colname: tf.keras.layers.Input(name=colname, shape=(), dtype='int32')
      for colname in _transformed_names(_BUCKET_FEATURE_KEYS)
  })
  input_layers.update({
      colname: tf.keras.layers.Input(name=colname, shape=(), dtype='int32')
      for colname in _transformed_names(_CATEGORICAL_FEATURE_KEYS)
  })

  # TODO(b/161952382): Replace with Keras preprocessing layers.
  deep = tf.keras.layers.DenseFeatures(deep_columns)(input_layers)
  for numnodes in dnn_hidden_units:
    deep = tf.keras.layers.Dense(numnodes)(deep)
  wide = tf.keras.layers.DenseFeatures(wide_columns)(input_layers)

  output = tf.keras.layers.Dense(
      1, activation='sigmoid')(
          tf.keras.layers.concatenate([deep, wide]))

  model = tf.keras.Model(input_layers, output)
  model.compile(
      loss='binary_crossentropy',
      optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
      metrics=[tf.keras.metrics.BinaryAccuracy()])
  model.summary(print_fn=absl.logging.info)
  return model


# TFX Trainer will call this function.
def run_fn(fn_args: TrainerFnArgs):
  """Train the model based on given args.

  Args:
    fn_args: Holds args used to train the model as name/value pairs.
  """
  # Number of nodes in the first layer of the DNN
  first_dnn_layer_size = 100
  num_dnn_layers = 4
  dnn_decay_factor = 0.7

  tf_transform_output = tft.TFTransformOutput(fn_args.transform_output)

  train_dataset = _input_fn(fn_args.train_files, fn_args.data_accessor, 
                            tf_transform_output, 40)
  eval_dataset = _input_fn(fn_args.eval_files, fn_args.data_accessor, 
                           tf_transform_output, 40)

  model = _build_keras_model(
      # Construct layer sizes with exponential decay
      hidden_units=[
          max(2, int(first_dnn_layer_size * dnn_decay_factor**i))
          for i in range(num_dnn_layers)
      ])

  tensorboard_callback = tf.keras.callbacks.TensorBoard(
      log_dir=fn_args.model_run_dir, update_freq='batch')
  model.fit(
      train_dataset,
      steps_per_epoch=fn_args.train_steps,
      validation_data=eval_dataset,
      validation_steps=fn_args.eval_steps,
      callbacks=[tensorboard_callback])

  signatures = {
      'serving_default':
          _get_serve_tf_examples_fn(model,
                                    tf_transform_output).get_concrete_function(
                                        tf.TensorSpec(
                                            shape=[None],
                                            dtype=tf.string,
                                            name='examples')),
  }
  model.save(fn_args.serving_model_dir, save_format='tf', signatures=signatures)
Writing taxi_trainer.py

Now, we pass in this model code to the Trainer component and run it to train the model.

trainer = Trainer(
    module_file=os.path.abspath(_taxi_trainer_module_file),
    custom_executor_spec=executor_spec.ExecutorClassSpec(GenericExecutor),
    examples=transform.outputs['transformed_examples'],
    transform_graph=transform.outputs['transform_graph'],
    schema=schema_gen.outputs['schema'],
    train_args=trainer_pb2.TrainArgs(num_steps=10000),
    eval_args=trainer_pb2.EvalArgs(num_steps=5000))
context.run(trainer)
WARNING:absl:From <ipython-input-1-47ec22ebd3e4>:3: The name tfx.components.base.executor_spec.ExecutorClassSpec is deprecated. Please use tfx.dsl.components.base.executor_spec.ExecutorClassSpec instead.
INFO:absl:Running driver for Trainer
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running executor for Trainer
INFO:absl:Train on the 'train' split when train_args.splits is not set.
INFO:absl:Evaluate on the 'eval' split when eval_args.splits is not set.
WARNING:absl:Examples artifact does not have payload_format custom property. Falling back to FORMAT_TF_EXAMPLE
WARNING:absl:Examples artifact does not have payload_format custom property. Falling back to FORMAT_TF_EXAMPLE
WARNING:absl:Examples artifact does not have payload_format custom property. Falling back to FORMAT_TF_EXAMPLE
INFO:absl:Loading source_path /tmpfs/src/temp/docs/tutorials/tfx/taxi_trainer.py as name user_module_1 because it has not been loaded before.
INFO:absl:Training model.
INFO:absl:Feature company_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_census_tract_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_community_area_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_latitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_longitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature fare_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature payment_type_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_census_tract_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_community_area_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_latitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_longitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature tips_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_miles_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_seconds_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_day_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_hour_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_month_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature company_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_census_tract_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_community_area_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_latitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_longitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature fare_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature payment_type_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_census_tract_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_community_area_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_latitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_longitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature tips_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_miles_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_seconds_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_day_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_hour_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_month_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature company_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_census_tract_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_community_area_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_latitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_longitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature fare_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature payment_type_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_census_tract_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_community_area_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_latitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_longitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature tips_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_miles_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_seconds_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_day_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_hour_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_month_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature company_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_census_tract_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_community_area_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_latitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature dropoff_longitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature fare_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature payment_type_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_census_tract_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_community_area_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_latitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature pickup_longitude_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature tips_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_miles_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_seconds_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_day_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_hour_xf has a shape . Setting to DenseTensor.
INFO:absl:Feature trip_start_month_xf has a shape . Setting to DenseTensor.
INFO:absl:Model: "model"
INFO:absl:__________________________________________________________________________________________________
INFO:absl:Layer (type)                    Output Shape         Param #     Connected to                     
INFO:absl:==================================================================================================
INFO:absl:company_xf (InputLayer)         [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dropoff_census_tract_xf (InputL [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dropoff_community_area_xf (Inpu [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dropoff_latitude_xf (InputLayer [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dropoff_longitude_xf (InputLaye [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:fare_xf (InputLayer)            [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:payment_type_xf (InputLayer)    [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:pickup_census_tract_xf (InputLa [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:pickup_community_area_xf (Input [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:pickup_latitude_xf (InputLayer) [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:pickup_longitude_xf (InputLayer [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:trip_miles_xf (InputLayer)      [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:trip_seconds_xf (InputLayer)    [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:trip_start_day_xf (InputLayer)  [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:trip_start_hour_xf (InputLayer) [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:trip_start_month_xf (InputLayer [(None,)]            0                                            
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dense_features (DenseFeatures)  (None, 3)            0           company_xf[0][0]                 
INFO:absl:                                                                 dropoff_census_tract_xf[0][0]    
INFO:absl:                                                                 dropoff_community_area_xf[0][0]  
INFO:absl:                                                                 dropoff_latitude_xf[0][0]        
INFO:absl:                                                                 dropoff_longitude_xf[0][0]       
INFO:absl:                                                                 fare_xf[0][0]                    
INFO:absl:                                                                 payment_type_xf[0][0]            
INFO:absl:                                                                 pickup_census_tract_xf[0][0]     
INFO:absl:                                                                 pickup_community_area_xf[0][0]   
INFO:absl:                                                                 pickup_latitude_xf[0][0]         
INFO:absl:                                                                 pickup_longitude_xf[0][0]        
INFO:absl:                                                                 trip_miles_xf[0][0]              
INFO:absl:                                                                 trip_seconds_xf[0][0]            
INFO:absl:                                                                 trip_start_day_xf[0][0]          
INFO:absl:                                                                 trip_start_hour_xf[0][0]         
INFO:absl:                                                                 trip_start_month_xf[0][0]        
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dense (Dense)                   (None, 100)          400         dense_features[0][0]             
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dense_1 (Dense)                 (None, 70)           7070        dense[0][0]                      
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dense_2 (Dense)                 (None, 48)           3408        dense_1[0][0]                    
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dense_3 (Dense)                 (None, 34)           1666        dense_2[0][0]                    
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dense_features_1 (DenseFeatures (None, 2127)         0           company_xf[0][0]                 
INFO:absl:                                                                 dropoff_census_tract_xf[0][0]    
INFO:absl:                                                                 dropoff_community_area_xf[0][0]  
INFO:absl:                                                                 dropoff_latitude_xf[0][0]        
INFO:absl:                                                                 dropoff_longitude_xf[0][0]       
INFO:absl:                                                                 fare_xf[0][0]                    
INFO:absl:                                                                 payment_type_xf[0][0]            
INFO:absl:                                                                 pickup_census_tract_xf[0][0]     
INFO:absl:                                                                 pickup_community_area_xf[0][0]   
INFO:absl:                                                                 pickup_latitude_xf[0][0]         
INFO:absl:                                                                 pickup_longitude_xf[0][0]        
INFO:absl:                                                                 trip_miles_xf[0][0]              
INFO:absl:                                                                 trip_seconds_xf[0][0]            
INFO:absl:                                                                 trip_start_day_xf[0][0]          
INFO:absl:                                                                 trip_start_hour_xf[0][0]         
INFO:absl:                                                                 trip_start_month_xf[0][0]        
INFO:absl:__________________________________________________________________________________________________
INFO:absl:concatenate (Concatenate)       (None, 2161)         0           dense_3[0][0]                    
INFO:absl:                                                                 dense_features_1[0][0]           
INFO:absl:__________________________________________________________________________________________________
INFO:absl:dense_4 (Dense)                 (None, 1)            2162        concatenate[0][0]                
INFO:absl:==================================================================================================
INFO:absl:Total params: 14,706
INFO:absl:Trainable params: 14,706
INFO:absl:Non-trainable params: 0
INFO:absl:__________________________________________________________________________________________________
10000/10000 [==============================] - 84s 8ms/step - loss: 0.3103 - binary_accuracy: 0.8464 - val_loss: 0.2227 - val_binary_accuracy: 0.8831
INFO:tensorflow:Saver not created because there are no variables in the graph to restore
INFO:tensorflow:Assets written to: /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Trainer/model/6/Format-Serving/assets
INFO:absl:Training complete. Model written to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Trainer/model/6/Format-Serving. ModelRun written to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Trainer/model_run/6
INFO:absl:Running publisher for Trainer
INFO:absl:MetadataStore with DB connection initialized

Analyze Training with TensorBoard

Take a peek at the trainer artifact. It points at a directory containing the model subdirectories.

model_artifact_dir = trainer.outputs['model'].get()[0].uri
pp.pprint(os.listdir(model_artifact_dir))
model_dir = os.path.join(model_artifact_dir, 'Format-Serving')
pp.pprint(os.listdir(model_dir))
['Format-Serving']
['variables', 'assets', 'saved_model.pb']

Optionally, we can connect TensorBoard to the Trainer to analyze our model's training curves.

model_run_artifact_dir = trainer.outputs['model_run'].get()[0].uri

%load_ext tensorboard
%tensorboard --logdir {model_run_artifact_dir}

Evaluator

The Evaluator component computes model performance metrics over the evaluation set. It uses the TensorFlow Model Analysis library. Evaluator can also optionally validate that a newly trained model is better than the previous model. This is useful in a production pipeline setting where you may automatically train and validate a model every day. In this notebook we only train one model, so Evaluator will automatically label the model as "good".

Evaluator will take as input the data from ExampleGen, the trained model from Trainer, and slicing configuration. The slicing configuration allows you to slice your metrics on feature values (e.g. how does your model perform on taxi trips that start at 8am versus 8pm?). See an example of this configuration below:

eval_config = tfma.EvalConfig(
    model_specs=[
        # This assumes a serving model with signature 'serving_default'. If
        # using estimator based EvalSavedModel, add signature_name: 'eval' and 
        # remove the label_key.
        tfma.ModelSpec(label_key='tips')
    ],
    metrics_specs=[
        tfma.MetricsSpec(
            # The metrics added here are in addition to those saved with the
            # model (assuming either a keras model or EvalSavedModel is used).
            # Any metrics added into the saved model (for example using
            # model.compile(..., metrics=[...]), etc) will be computed
            # automatically.
            # To add validation thresholds for metrics saved with the model,
            # add them keyed by metric name to the thresholds map.
            metrics=[
                tfma.MetricConfig(class_name='ExampleCount'),
                tfma.MetricConfig(class_name='BinaryAccuracy',
                  threshold=tfma.MetricThreshold(
                      value_threshold=tfma.GenericValueThreshold(
                          lower_bound={'value': 0.5}),
                      # Change threshold will be ignored if there is no
                      # baseline model resolved from MLMD (first run).
                      change_threshold=tfma.GenericChangeThreshold(
                          direction=tfma.MetricDirection.HIGHER_IS_BETTER,
                          absolute={'value': -1e-10})))
            ]
        )
    ],
    slicing_specs=[
        # An empty slice spec means the overall slice, i.e. the whole dataset.
        tfma.SlicingSpec(),
        # Data can be sliced along a feature column. In this case, data is
        # sliced along feature column trip_start_hour.
        tfma.SlicingSpec(feature_keys=['trip_start_hour'])
    ])

Next, we give this configuration to Evaluator and run it.

# Use TFMA to compute evaluation statistics over the features of a model and
# validate them against a baseline.

# The model resolver is only required if performing model validation in addition
# to evaluation. In this case we validate against the latest blessed model. If
# no model has been blessed before (as in this case) the evaluator will make our
# candidate the first blessed model.
model_resolver = resolver.Resolver(
      instance_name='latest_blessed_model_resolver',
      strategy_class=latest_blessed_model_resolver.LatestBlessedModelResolver,
      model=Channel(type=Model),
      model_blessing=Channel(type=ModelBlessing))
context.run(model_resolver)

evaluator = Evaluator(
    examples=example_gen.outputs['examples'],
    model=trainer.outputs['model'],
    baseline_model=model_resolver.outputs['model'],
    eval_config=eval_config)
context.run(evaluator)
WARNING:absl:`instance_name` is deprecated, please set the node id directly using `with_id()` or the `.id` setter.
INFO:absl:Running driver for Resolver.latest_blessed_model_resolver
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running publisher for Resolver.latest_blessed_model_resolver
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running driver for Evaluator
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running executor for Evaluator
ERROR:absl:There are change thresholds, but the baseline is missing. This is allowed only when rubber stamping (first run).
INFO:absl:Request was made to ignore the baseline ModelSpec and any change thresholds. This is likely because a baseline model was not provided: updated_config=
model_specs {
  label_key: "tips"
}
slicing_specs {
}
slicing_specs {
  feature_keys: "trip_start_hour"
}
metrics_specs {
  metrics {
    class_name: "ExampleCount"
  }
  metrics {
    class_name: "BinaryAccuracy"
    threshold {
      value_threshold {
        lower_bound {
          value: 0.5
        }
      }
    }
  }
}

INFO:absl:Using /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Trainer/model/6/Format-Serving as  model.
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.

Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7fd3bc05a9b0> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7fd3bf50a8d0>).
INFO:absl:The 'example_splits' parameter is not set, using 'eval' split.
INFO:absl:Evaluating model.
INFO:absl:Request was made to ignore the baseline ModelSpec and any change thresholds. This is likely because a baseline model was not provided: updated_config=
model_specs {
  label_key: "tips"
}
slicing_specs {
}
slicing_specs {
  feature_keys: "trip_start_hour"
}
metrics_specs {
  metrics {
    class_name: "ExampleCount"
  }
  metrics {
    class_name: "BinaryAccuracy"
    threshold {
      value_threshold {
        lower_bound {
          value: 0.5
        }
      }
    }
  }
  model_names: ""
}

INFO:absl:Request was made to ignore the baseline ModelSpec and any change thresholds. This is likely because a baseline model was not provided: updated_config=
model_specs {
  label_key: "tips"
}
slicing_specs {
}
slicing_specs {
  feature_keys: "trip_start_hour"
}
metrics_specs {
  metrics {
    class_name: "ExampleCount"
  }
  metrics {
    class_name: "BinaryAccuracy"
    threshold {
      value_threshold {
        lower_bound {
          value: 0.5
        }
      }
    }
  }
  model_names: ""
}

INFO:absl:Request was made to ignore the baseline ModelSpec and any change thresholds. This is likely because a baseline model was not provided: updated_config=
model_specs {
  label_key: "tips"
}
slicing_specs {
}
slicing_specs {
  feature_keys: "trip_start_hour"
}
metrics_specs {
  metrics {
    class_name: "ExampleCount"
  }
  metrics {
    class_name: "BinaryAccuracy"
    threshold {
      value_threshold {
        lower_bound {
          value: 0.5
        }
      }
    }
  }
  model_names: ""
}
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.

Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7fd33c3bf588> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7fd33c3e6a20>).
Exception ignored in: <bound method CapturableResourceDeleter.__del__ of <tensorflow.python.training.tracking.tracking.CapturableResourceDeleter object at 0x7fd3bf2dce10>>
Traceback (most recent call last):
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/training/tracking/tracking.py", line 208, in __del__
    self._destroy_resource()
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/def_function.py", line 828, in __call__
    result = self._call(*args, **kwds)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/def_function.py", line 871, in _call
    self._initialize(args, kwds, add_initializers_to=initializers)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/def_function.py", line 726, in _initialize
    *args, **kwds))
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/function.py", line 2969, in _get_concrete_function_internal_garbage_collected
    graph_function, _ = self._maybe_define_function(args, kwargs)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/function.py", line 3361, in _maybe_define_function
    graph_function = self._create_graph_function(args, kwargs)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/function.py", line 3206, in _create_graph_function
    capture_by_value=self._capture_by_value),
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/framework/func_graph.py", line 990, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/def_function.py", line 634, in wrapped_fn
    out = weak_wrapped_fn().__wrapped__(*args, **kwds)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/saved_model/function_deserialization.py", line 253, in restored_function_body
    return _call_concrete_function(function, inputs)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/saved_model/function_deserialization.py", line 75, in _call_concrete_function
    result = function._call_flat(tensor_inputs, function._captured_inputs)  # pylint: disable=protected-access
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py", line 116, in _call_flat
    cancellation_manager)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/function.py", line 1932, in _call_flat
    flat_outputs = forward_function.call(ctx, args_with_tangents)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/function.py", line 589, in call
    executor_type=executor_type)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/ops/functional_ops.py", line 1206, in partitioned_call
    f.add_to_graph(graph)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/eager/function.py", line 505, in add_to_graph
    g._add_function(self)
  File "/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3396, in _add_function
    gradient)
tensorflow.python.framework.errors_impl.InvalidArgumentError: 'func' argument to TF_GraphCopyFunction cannot be null
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.

Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7fd2de7f3940> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7fd2df83ca20>).
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.

Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7fd2c66ccd30> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7fd2c66a90f0>).
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.

Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7fceb5d73390> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7fceb5d30198>).
WARNING:tensorflow:5 out of the last 5 calls to <function recreate_function.<locals>.restored_function_body at 0x7fd390044378> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for  more details.
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.

Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7fd3bf492518> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7fd42488d828>).
WARNING:tensorflow:6 out of the last 6 calls to <function recreate_function.<locals>.restored_function_body at 0x7fceb53aa488> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for  more details.
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.

Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7fceb52a62e8> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7fceb5223e48>).
WARNING:tensorflow:7 out of the last 7 calls to <function recreate_function.<locals>.restored_function_body at 0x7fceb48a00d0> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for  more details.
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.

Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7fceb46de588> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7fceb472a780>).
WARNING:tensorflow:8 out of the last 8 calls to <function recreate_function.<locals>.restored_function_body at 0x7fce9feb49d8> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for  more details.
INFO:absl:Evaluation complete. Results written to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Evaluator/evaluation/8.
INFO:absl:Checking validation results.
WARNING:tensorflow:From /tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow_model_analysis/writers/metrics_plots_and_validations_writer.py:113: tf_record_iterator (from tensorflow.python.lib.io.tf_record) is deprecated and will be removed in a future version.
Instructions for updating:
Use eager execution and: 
`tf.data.TFRecordDataset(path)`
INFO:absl:Blessing result True written to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Evaluator/blessing/8.
INFO:absl:Running publisher for Evaluator
INFO:absl:MetadataStore with DB connection initialized

Now let's examine the output artifacts of Evaluator.

evaluator.outputs
{'evaluation': Channel(
    type_name: ModelEvaluation
    artifacts: [Artifact(artifact: id: 10
type_id: 20
uri: "/tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Evaluator/evaluation/8"
custom_properties {
  key: "name"
  value {
    string_value: "evaluation"
  }
}
custom_properties {
  key: "producer_component"
  value {
    string_value: "Evaluator"
  }
}
custom_properties {
  key: "state"
  value {
    string_value: "published"
  }
}
custom_properties {
  key: "tfx_version"
  value {
    string_value: "0.29.0"
  }
}
state: LIVE
, artifact_type: id: 20
name: "ModelEvaluation"
)]
    additional_properties: {}
    additional_custom_properties: {}
), 'blessing': Channel(
    type_name: ModelBlessing
    artifacts: [Artifact(artifact: id: 11
type_id: 21
uri: "/tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Evaluator/blessing/8"
custom_properties {
  key: "blessed"
  value {
    int_value: 1
  }
}
custom_properties {
  key: "current_model"
  value {
    string_value: "/tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Trainer/model/6"
  }
}
custom_properties {
  key: "current_model_id"
  value {
    int_value: 8
  }
}
custom_properties {
  key: "name"
  value {
    string_value: "blessing"
  }
}
custom_properties {
  key: "producer_component"
  value {
    string_value: "Evaluator"
  }
}
custom_properties {
  key: "state"
  value {
    string_value: "published"
  }
}
custom_properties {
  key: "tfx_version"
  value {
    string_value: "0.29.0"
  }
}
state: LIVE
, artifact_type: id: 21
name: "ModelBlessing"
)]
    additional_properties: {}
    additional_custom_properties: {}
)}

Using the evaluation output we can show the default visualization of global metrics on the entire evaluation set.

context.show(evaluator.outputs['evaluation'])

To see the visualization for sliced evaluation metrics, we can directly call the TensorFlow Model Analysis library.

import tensorflow_model_analysis as tfma

# Get the TFMA output result path and load the result.
PATH_TO_RESULT = evaluator.outputs['evaluation'].get()[0].uri
tfma_result = tfma.load_eval_result(PATH_TO_RESULT)

# Show data sliced along feature column trip_start_hour.
tfma.view.render_slicing_metrics(
    tfma_result, slicing_column='trip_start_hour')
SlicingMetricsViewer(config={'weightedExamplesColumn': 'example_count'}, data=[{'slice': 'trip_start_hour:19',…

This visualization shows the same metrics, but computed at every feature value of trip_start_hour instead of on the whole evaluation set.
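
If you prefer to read these numbers programmatically rather than through the widget, the loaded result also exposes them as plain Python data. Below is a minimal sketch, reusing the tfma_result object from the cell above; the exact nesting of each metrics dictionary can vary between TFMA versions.

# A minimal sketch: iterate over the sliced metrics that back the widget above.
# Reuses `tfma_result` from the previous cell; the nesting of the metrics
# dictionaries may differ slightly between TFMA versions.
for slice_key, metrics in tfma_result.slicing_metrics:
  # slice_key is a tuple such as (('trip_start_hour', 19),);
  # the empty tuple () denotes the overall (unsliced) dataset.
  print('slice:', slice_key)
  print('metrics:', metrics)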

TensorFlow Model Analysis supports many other visualizations, such as Fairness Indicators and plotting a time series of model performance. To learn more, see the tutorial.
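
As a hedged illustration (not part of the original notebook), the Fairness Indicators widget can be rendered from this same evaluation result. Note that the widget only has data to show if fairness metrics were computed during evaluation, which this tutorial's eval_config does not request.

# A sketch only: render Fairness Indicators from an evaluation result.
# This assumes fairness metrics were added to the MetricsSpec in eval_config,
# e.g. tfma.MetricConfig(class_name='FairnessIndicators'); the config used
# in this tutorial does not include them.
from tensorflow_model_analysis.addons.fairness.view import widget_view

widget_view.render_fairness_indicator(tfma_result)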

Since we added thresholds to our config, validation output is also available. The presence of a blessing artifact indicates that our model passed validation. Since this is the first validation being performed, the candidate is automatically blessed.

blessing_uri = evaluator.outputs.blessing.get()[0].uri
!ls -l {blessing_uri}
total 0
-rw-rw-r-- 1 kbuilder kbuilder 0 May  4 09:17 BLESSED

Now you can also verify success by loading the validation result record:

PATH_TO_RESULT = evaluator.outputs['evaluation'].get()[0].uri
print(tfma.load_validation_result(PATH_TO_RESULT))
validation_ok: true
validation_details {
  slicing_details {
    slicing_spec {
    }
    num_matching_slices: 25
  }
}
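
In an automated pipeline you would typically branch on this result rather than inspect it by eye. Below is a minimal sketch, reusing PATH_TO_RESULT from the cell above.

# A minimal sketch: gate a downstream step on the validation outcome.
# Reuses PATH_TO_RESULT from the previous cell.
validation_result = tfma.load_validation_result(PATH_TO_RESULT)
if validation_result.validation_ok:
  print('Model passed validation; safe to push.')
else:
  raise RuntimeError('Model failed validation: {}'.format(validation_result))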

Pusher

The Pusher component is usually at the end of a TFX pipeline. It checks whether a model has passed validation, and if so, exports the model to _serving_model_dir.

pusher = Pusher(
    model=trainer.outputs['model'],
    model_blessing=evaluator.outputs['blessing'],
    push_destination=pusher_pb2.PushDestination(
        filesystem=pusher_pb2.PushDestination.Filesystem(
            base_directory=_serving_model_dir)))
context.run(pusher)
INFO:absl:Running driver for Pusher
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running executor for Pusher
INFO:absl:Model version: 1620119867
INFO:absl:Model written to serving path /tmp/tmp5r_8to_0/serving_model/taxi_simple/1620119867.
INFO:absl:Model pushed to /tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Pusher/pushed_model/9.
INFO:absl:Running publisher for Pusher
INFO:absl:MetadataStore with DB connection initialized

Let's examine the output artifacts of Pusher.

pusher.outputs
{'pushed_model': Channel(
    type_name: PushedModel
    artifacts: [Artifact(artifact: id: 12
type_id: 23
uri: "/tmp/tfx-interactive-2021-05-04T09_15_26.254712-39v3w3l2/Pusher/pushed_model/9"
custom_properties {
  key: "name"
  value {
    string_value: "pushed_model"
  }
}
custom_properties {
  key: "producer_component"
  value {
    string_value: "Pusher"
  }
}
custom_properties {
  key: "pushed"
  value {
    int_value: 1
  }
}
custom_properties {
  key: "pushed_destination"
  value {
    string_value: "/tmp/tmp5r_8to_0/serving_model/taxi_simple/1620119867"
  }
}
custom_properties {
  key: "pushed_version"
  value {
    string_value: "1620119867"
  }
}
custom_properties {
  key: "state"
  value {
    string_value: "published"
  }
}
custom_properties {
  key: "tfx_version"
  value {
    string_value: "0.29.0"
  }
}
state: LIVE
, artifact_type: id: 23
name: "PushedModel"
)]
    additional_properties: {}
    additional_custom_properties: {}
)}

In particular, Pusher will export your model in the SavedModel format, which looks like this:

push_uri = pusher.outputs.pushed_model.get()[0].uri
model = tf.saved_model.load(push_uri)

for item in model.signatures.items():
  pp.pprint(item)
WARNING:tensorflow:9 out of the last 9 calls to <function recreate_function.<locals>.restored_function_body at 0x7fce9ead5158> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for  more details.
('serving_default',
 <ConcreteFunction signature_wrapper(*, examples) at 0x7FCE9FA09D30>)
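
To sanity-check the exported signature end to end, you can feed it a single serialized raw tf.Example. The sketch below is an illustration rather than part of the original notebook; it assumes the ExampleGen artifact keeps its gzipped TFRecords under an 'eval' subdirectory (the default layout for this TFX version) and reuses the model loaded above.

# A hedged sketch: run one raw serialized example through the pushed model's
# serving signature. Assumes the ExampleGen output layout described above and
# reuses `model` and `example_gen` defined earlier in the notebook.
import glob

eval_uri = os.path.join(example_gen.outputs['examples'].get()[0].uri, 'eval')
tfrecord_files = glob.glob(os.path.join(eval_uri, '*'))
dataset = tf.data.TFRecordDataset(tfrecord_files, compression_type='GZIP')

serialized_example = next(iter(dataset))          # one raw tf.Example, still serialized
serving_fn = model.signatures['serving_default']  # parses and transforms internally
print(serving_fn(examples=tf.reshape(serialized_example, [1])))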

We've finished our tour of the built-in TFX components!