A Component-by-Component Introduction to TensorFlow Extended (TFX)
This Colab-based tutorial interactively walks through each built-in component of TensorFlow Extended (TFX).
It covers every step in an end-to-end machine learning pipeline, from data ingestion to serving a model.
When you're done, the contents of this notebook can be automatically exported as TFX pipeline source code, which you can orchestrate with Apache Airflow or Apache Beam.
Background
This notebook demonstrates how to use TFX in a Jupyter/Colab environment. Here, we walk through the Chicago Taxi example in an interactive notebook.
Working in an interactive notebook is a useful way to become familiar with the structure of a TFX pipeline. It is also useful as a lightweight development environment when developing your own pipelines, but you should be aware that there are differences in the way interactive notebooks are orchestrated and how they access metadata artifacts.
Orchestration
In a production deployment of TFX, you will use an orchestrator such as Apache Airflow, Kubeflow Pipelines, or Apache Beam to orchestrate a pre-defined pipeline graph of TFX components. In an interactive notebook, the notebook itself is the orchestrator, running each TFX component as you execute the notebook cells.
Metadata
In a production deployment of TFX, you will access metadata through the ML Metadata (MLMD) API. MLMD stores metadata properties in a database such as MySQL or SQLite, and stores the metadata payloads in a persistent store such as your filesystem. In an interactive notebook, both properties and payloads are stored in an ephemeral SQLite database in the /tmp directory on the Jupyter notebook or Colab server.
Setup
First, we install and import the necessary packages, set up paths, and download the data.
Upgrade Pip
To avoid upgrading Pip in a system when running locally, check to make sure that we're running in Colab. Local systems can of course be upgraded separately.
try:
  import colab
  !pip install --upgrade pip
except:
  pass
Install TFX
pip install -q -U --use-feature=2020-resolver tfx
Did you restart the runtime?
If you are using Google Colab, the first time that you run the cell above, you must restart the runtime (Runtime > Restart runtime ...). This is because of the way that Colab loads packages.
Import packages
We import the necessary packages, including the standard TFX component classes.
import os
import pprint
import tempfile
import urllib
import absl
import tensorflow as tf
import tensorflow_model_analysis as tfma
tf.get_logger().propagate = False
pp = pprint.PrettyPrinter()
import tfx
from tfx.components import CsvExampleGen
from tfx.components import Evaluator
from tfx.components import ExampleValidator
from tfx.components import Pusher
from tfx.components import ResolverNode
from tfx.components import SchemaGen
from tfx.components import StatisticsGen
from tfx.components import Trainer
from tfx.components import Transform
from tfx.components.base import executor_spec
from tfx.components.trainer.executor import GenericExecutor
from tfx.dsl.experimental import latest_blessed_model_resolver
from tfx.orchestration import metadata
from tfx.orchestration import pipeline
from tfx.orchestration.experimental.interactive.interactive_context import InteractiveContext
from tfx.proto import pusher_pb2
from tfx.proto import trainer_pb2
from tfx.types import Channel
from tfx.types.standard_artifacts import Model
from tfx.types.standard_artifacts import ModelBlessing
from tfx.utils.dsl_utils import external_input
%load_ext tfx.orchestration.experimental.interactive.notebook_extensions.skip
WARNING:absl:RuntimeParameter is only supported on Cloud-based DAG runner currently.
Let's check the library versions.
print('TensorFlow version: {}'.format(tf.__version__))
print('TFX version: {}'.format(tfx.__version__))
TensorFlow version: 2.3.1
TFX version: 0.25.0
Set up pipeline paths
# This is the root directory for your TFX pip package installation.
_tfx_root = tfx.__path__[0]
# This is the directory containing the TFX Chicago Taxi Pipeline example.
_taxi_root = os.path.join(_tfx_root, 'examples/chicago_taxi_pipeline')
# This is the path where your model will be pushed for serving.
_serving_model_dir = os.path.join(
    tempfile.mkdtemp(), 'serving_model/taxi_simple')
# Set up logging.
absl.logging.set_verbosity(absl.logging.INFO)
Download example data
We download the example dataset for use in our TFX pipeline.
The dataset we're using is the Taxi Trips dataset released by the City of Chicago. The columns in this dataset are:
pickup_community_area | fare | trip_start_month |
trip_start_hour | trip_start_day | trip_start_timestamp |
pickup_latitude | pickup_longitude | dropoff_latitude |
dropoff_longitude | trip_miles | pickup_census_tract |
dropoff_census_tract | payment_type | company |
trip_seconds | dropoff_community_area | tips |
With this dataset, we will build a model that predicts the tips of a trip.
_data_root = tempfile.mkdtemp(prefix='tfx-data')
DATA_PATH = 'https://raw.githubusercontent.com/tensorflow/tfx/master/tfx/examples/chicago_taxi_pipeline/data/simple/data.csv'
_data_filepath = os.path.join(_data_root, "data.csv")
urllib.request.urlretrieve(DATA_PATH, _data_filepath)
('/tmp/tfx-datakmnyv05b/data.csv', <http.client.HTTPMessage at 0x7f1bf0013c50>)
Take a quick look at the CSV file.
head {_data_filepath}
pickup_community_area,fare,trip_start_month,trip_start_hour,trip_start_day,trip_start_timestamp,pickup_latitude,pickup_longitude,dropoff_latitude,dropoff_longitude,trip_miles,pickup_census_tract,dropoff_census_tract,payment_type,company,trip_seconds,dropoff_community_area,tips
,12.45,5,19,6,1400269500,,,,,0.0,,,Credit Card,Chicago Elite Cab Corp. (Chicago Carriag,0,,0.0
,0,3,19,5,1362683700,,,,,0,,,Unknown,Chicago Elite Cab Corp.,300,,0
60,27.05,10,2,3,1380593700,41.836150155,-87.648787952,,,12.6,,,Cash,Taxi Affiliation Services,1380,,0.0
10,5.85,10,1,2,1382319000,41.985015101,-87.804532006,,,0.0,,,Cash,Taxi Affiliation Services,180,,0.0
14,16.65,5,7,5,1369897200,41.968069,-87.721559063,,,0.0,,,Cash,Dispatch Taxi Affiliation,1080,,0.0
13,16.45,11,12,3,1446554700,41.983636307,-87.723583185,,,6.9,,,Cash,,780,,0.0
16,32.05,12,1,1,1417916700,41.953582125,-87.72345239,,,15.4,,,Cash,,1200,,0.0
30,38.45,10,10,5,1444301100,41.839086906,-87.714003807,,,14.6,,,Cash,,2580,,0.0
11,14.65,1,1,3,1358213400,41.978829526,-87.771166703,,,5.81,,,Cash,,1080,,0.0
Disclaimer: This site provides applications using data that has been modified for use from its original source, www.cityofchicago.org, the official website of the City of Chicago. The City of Chicago makes no claims as to the content, accuracy, timeliness, or completeness of any of the data provided at this site. The data provided at this site is subject to change at any time. It is understood that the data provided at this site is being used at one's own risk.
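Note the runs of consecutive commas in these rows: empty fields are how missing values show up in the raw CSV. A quick standard-library sketch of what that means when parsing (the two-row sample below is hypothetical, mirroring the header and first data row above):

```python
import csv
import io

# A tiny hypothetical sample mirroring the header and first data row above.
sample = (
    "pickup_community_area,fare,trip_start_month,payment_type\n"
    ",12.45,5,Credit Card\n"
)

header, first_row = list(csv.reader(io.StringIO(sample)))
record = dict(zip(header, first_row))

# Empty strings mark missing values; downstream steps (such as the
# _fill_in_missing helper used later in the Transform module) must handle them.
print(repr(record["pickup_community_area"]))  # '' (missing)
print(float(record["fare"]))
```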
Create the InteractiveContext
Last, we create an InteractiveContext, which will allow us to run TFX components interactively in this notebook.
# Here, we create an InteractiveContext using default parameters. This will
# use a temporary directory with an ephemeral ML Metadata database instance.
# To use your own pipeline root or database, the optional properties
# `pipeline_root` and `metadata_connection_config` may be passed to
# InteractiveContext. Calls to InteractiveContext are no-ops outside of the
# notebook.
context = InteractiveContext()
WARNING:absl:InteractiveContext pipeline_root argument not provided: using temporary directory /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7 as root for pipeline outputs. WARNING:absl:InteractiveContext metadata_connection_config not provided: using SQLite ML Metadata database at /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/metadata.sqlite.
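As the warnings above note, the default context writes everything to temporary directories. If you want the pipeline outputs and the metadata database to survive the session, the optional parameters mentioned in the cell's comment can be passed explicitly. A configuration sketch with hypothetical paths:

```python
from tfx.orchestration import metadata
from tfx.orchestration.experimental.interactive.interactive_context import InteractiveContext

# Hypothetical, persistent locations instead of the temporary defaults.
context = InteractiveContext(
    pipeline_name='taxi_interactive',
    pipeline_root='/path/to/pipeline_root',
    metadata_connection_config=metadata.sqlite_metadata_connection_config(
        '/path/to/metadata.sqlite'))
```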
Run TFX components interactively
In the cells that follow, we create TFX components one-by-one, run each of them, and visualize their output artifacts.
ExampleGen
The ExampleGen component is usually at the start of a TFX pipeline. It will:
- Split data into training and evaluation sets (by default, 2/3 training + 1/3 eval)
- Convert data into the tf.Example format
- Copy data into the _tfx_root directory for other components to access
ExampleGen takes the path to your data source as input. In our case, this is the _data_root path that contains the downloaded CSV.
example_gen = CsvExampleGen(input=external_input(_data_root))
context.run(example_gen)
WARNING:tensorflow:From <ipython-input-1-2e0190c2dd16>:1: external_input (from tfx.utils.dsl_utils) is deprecated and will be removed in a future version. Instructions for updating: external_input is deprecated, directly pass the uri to ExampleGen. Warning:absl:The "input" argument to the CsvExampleGen component has been deprecated by "input_base". Please update your usage as support for this argument will be removed soon. INFO:absl:Running driver for CsvExampleGen INFO:absl:MetadataStore with DB connection initialized INFO:absl:select span and version = (0, None) INFO:absl:latest span and version = (0, None) INFO:absl:Running executor for CsvExampleGen INFO:absl:Generating examples. WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features. INFO:absl:Processing input csv data /tmp/tfx-datakmnyv05b/* to TFExample. WARNING:apache_beam.io.tfrecordio:Couldn't find python-snappy so the implementation of _TFRecordUtil._masked_crc32c is not as fast as it could be. INFO:absl:Examples generated. INFO:absl:Running publisher for CsvExampleGen INFO:absl:MetadataStore with DB connection initialized
Let's examine the output artifacts of ExampleGen. This component produces two artifacts, training examples and evaluation examples:
artifact = example_gen.outputs['examples'].get()[0]
print(artifact.split_names, artifact.uri)
["train", "eval"] /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/CsvExampleGen/examples/1
We can also take a look at the first three training examples:
# Get the URI of the output artifact representing the training examples, which is a directory
train_uri = os.path.join(example_gen.outputs['examples'].get()[0].uri, 'train')

# Get the list of files in this directory (all compressed TFRecord files)
tfrecord_filenames = [os.path.join(train_uri, name)
                      for name in os.listdir(train_uri)]

# Create a `TFRecordDataset` to read these files
dataset = tf.data.TFRecordDataset(tfrecord_filenames, compression_type="GZIP")

# Iterate over the first 3 records and decode them.
for tfrecord in dataset.take(3):
  serialized_example = tfrecord.numpy()
  example = tf.train.Example()
  example.ParseFromString(serialized_example)
  pp.pprint(example)
features { feature { key: "company" value { bytes_list { value: "Chicago Elite Cab Corp. (Chicago Carriag" } } } feature { key: "dropoff_census_tract" value { int64_list { } } } feature { key: "dropoff_community_area" value { int64_list { } } } feature { key: "dropoff_latitude" value { float_list { } } } feature { key: "dropoff_longitude" value { float_list { } } } feature { key: "fare" value { float_list { value: 12.449999809265137 } } } feature { key: "payment_type" value { bytes_list { value: "Credit Card" } } } feature { key: "pickup_census_tract" value { int64_list { } } } feature { key: "pickup_community_area" value { int64_list { } } } feature { key: "pickup_latitude" value { float_list { } } } feature { key: "pickup_longitude" value { float_list { } } } feature { key: "tips" value { float_list { value: 0.0 } } } feature { key: "trip_miles" value { float_list { value: 0.0 } } } feature { key: "trip_seconds" value { int64_list { value: 0 } } } feature { key: "trip_start_day" value { int64_list { value: 6 } } } feature { key: "trip_start_hour" value { int64_list { value: 19 } } } feature { key: "trip_start_month" value { int64_list { value: 5 } } } feature { key: "trip_start_timestamp" value { int64_list { value: 1400269500 } } } } features { feature { key: "company" value { bytes_list { value: "Taxi Affiliation Services" } } } feature { key: "dropoff_census_tract" value { int64_list { } } } feature { key: "dropoff_community_area" value { int64_list { } } } feature { key: "dropoff_latitude" value { float_list { } } } feature { key: "dropoff_longitude" value { float_list { } } } feature { key: "fare" value { float_list { value: 27.049999237060547 } } } feature { key: "payment_type" value { bytes_list { value: "Cash" } } } feature { key: "pickup_census_tract" value { int64_list { } } } feature { key: "pickup_community_area" value { int64_list { value: 60 } } } feature { key: "pickup_latitude" value { float_list { value: 41.836151123046875 } } } feature { key: 
"pickup_longitude" value { float_list { value: -87.64878845214844 } } } feature { key: "tips" value { float_list { value: 0.0 } } } feature { key: "trip_miles" value { float_list { value: 12.600000381469727 } } } feature { key: "trip_seconds" value { int64_list { value: 1380 } } } feature { key: "trip_start_day" value { int64_list { value: 3 } } } feature { key: "trip_start_hour" value { int64_list { value: 2 } } } feature { key: "trip_start_month" value { int64_list { value: 10 } } } feature { key: "trip_start_timestamp" value { int64_list { value: 1380593700 } } } } features { feature { key: "company" value { bytes_list { } } } feature { key: "dropoff_census_tract" value { int64_list { } } } feature { key: "dropoff_community_area" value { int64_list { } } } feature { key: "dropoff_latitude" value { float_list { } } } feature { key: "dropoff_longitude" value { float_list { } } } feature { key: "fare" value { float_list { value: 16.450000762939453 } } } feature { key: "payment_type" value { bytes_list { value: "Cash" } } } feature { key: "pickup_census_tract" value { int64_list { } } } feature { key: "pickup_community_area" value { int64_list { value: 13 } } } feature { key: "pickup_latitude" value { float_list { value: 41.98363494873047 } } } feature { key: "pickup_longitude" value { float_list { value: -87.72357940673828 } } } feature { key: "tips" value { float_list { value: 0.0 } } } feature { key: "trip_miles" value { float_list { value: 6.900000095367432 } } } feature { key: "trip_seconds" value { int64_list { value: 780 } } } feature { key: "trip_start_day" value { int64_list { value: 3 } } } feature { key: "trip_start_hour" value { int64_list { value: 12 } } } feature { key: "trip_start_month" value { int64_list { value: 11 } } } feature { key: "trip_start_timestamp" value { int64_list { value: 1446554700 } } } }
Now that ExampleGen has finished ingesting the data, the next step is data analysis.
StatisticsGen
The StatisticsGen component computes statistics over your dataset for data analysis, as well as for use in downstream components. It uses the TensorFlow Data Validation library.
StatisticsGen takes as input the dataset we just ingested using ExampleGen.
statistics_gen = StatisticsGen(
    examples=example_gen.outputs['examples'])
context.run(statistics_gen)
INFO:absl:Excluding no splits because exclude_splits is not set. INFO:absl:Running driver for StatisticsGen INFO:absl:MetadataStore with DB connection initialized INFO:absl:Running executor for StatisticsGen INFO:absl:Generating statistics for split train. INFO:absl:Statistics for split train written to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/StatisticsGen/statistics/2/train. INFO:absl:Generating statistics for split eval. INFO:absl:Statistics for split eval written to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/StatisticsGen/statistics/2/eval. INFO:absl:Running publisher for StatisticsGen INFO:absl:MetadataStore with DB connection initialized
After StatisticsGen finishes running, we can visualize the outputted statistics. Try playing with the different plots!
context.show(statistics_gen.outputs['statistics'])
WARNING:tensorflow:From /tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow_data_validation/utils/stats_util.py:247: tf_record_iterator (from tensorflow.python.lib.io.tf_record) is deprecated and will be removed in a future version. Instructions for updating: Use eager execution and: `tf.data.TFRecordDataset(path)`
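To build intuition for what StatisticsGen computes, here is a toy, pure-Python sketch of per-feature summary statistics; the real component uses TensorFlow Data Validation over the full dataset and produces much richer output. The sample values are hypothetical:

```python
def numeric_stats(values):
    """Toy per-feature summary; None marks a missing value."""
    present = [v for v in values if v is not None]
    return {
        'count': len(values),
        'missing': len(values) - len(present),
        'min': min(present),
        'max': max(present),
        'mean': sum(present) / len(present),
    }

# A hypothetical sample of fare values with one missing entry.
print(numeric_stats([12.45, 27.05, None, 5.85]))
```

Counts of missing values, and the min/max/mean of what is present, are exactly the kind of signal the interactive statistics plots surface per feature.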
SchemaGen
The SchemaGen component generates a schema based on your data statistics. (A schema defines the expected bounds, types, and properties of the features in your dataset.) It also uses the TensorFlow Data Validation library.
SchemaGen will take as input the statistics that we generated with StatisticsGen, looking at the training split by default.
schema_gen = SchemaGen(
    statistics=statistics_gen.outputs['statistics'],
    infer_feature_shape=False)
context.run(schema_gen)
INFO:absl:Excluding no splits because exclude_splits is not set. INFO:absl:Running driver for SchemaGen INFO:absl:MetadataStore with DB connection initialized INFO:absl:Running executor for SchemaGen INFO:absl:Processing schema from statistics for split train. INFO:absl:Processing schema from statistics for split eval. INFO:absl:Schema written to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/SchemaGen/schema/3/schema.pbtxt. INFO:absl:Running publisher for SchemaGen INFO:absl:MetadataStore with DB connection initialized
After SchemaGen finishes running, we can visualize the generated schema as a table.
context.show(schema_gen.outputs['schema'])
/tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow_data_validation/utils/display_util.py:151: FutureWarning: Passing a negative integer is deprecated in version 1.0 and will not be supported in future version. Instead, use None to not limit the column width. pd.set_option('max_colwidth', -1)
Each feature in your dataset shows up as a row in the schema table, alongside its properties. The schema also captures all the values that a categorical feature takes on, denoted as its domain.
To learn more about schemas, see the SchemaGen documentation.
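To make the idea of a schema concrete, here is a toy, pure-Python sketch of inferring per-feature types and domains from observed values (TFDV's real inference is far more sophisticated and works from the computed statistics, not raw values):

```python
def infer_toy_schema(columns):
    """Infer a {feature: {'type': ..., 'domain': ...}} mapping from values."""
    schema = {}
    for key, values in columns.items():
        if all(isinstance(v, (int, float)) for v in values):
            schema[key] = {'type': 'FLOAT'}
        else:
            # Categorical feature: record every observed value as its domain.
            schema[key] = {'type': 'BYTES', 'domain': sorted(set(values))}
    return schema

# Hypothetical sample columns.
toy = infer_toy_schema({
    'fare': [12.45, 27.05, 5.85],
    'payment_type': ['Cash', 'Credit Card', 'Cash'],
})
print(toy)
```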
ExampleValidator
The ExampleValidator component detects anomalies in your data, based on the expectations defined by the schema. It also uses the TensorFlow Data Validation library.
ExampleValidator will take as input the statistics from StatisticsGen and the schema from SchemaGen.
example_validator = ExampleValidator(
    statistics=statistics_gen.outputs['statistics'],
    schema=schema_gen.outputs['schema'])
context.run(example_validator)
INFO:absl:Excluding no splits because exclude_splits is not set. INFO:absl:Running driver for ExampleValidator INFO:absl:MetadataStore with DB connection initialized INFO:absl:Running executor for ExampleValidator INFO:absl:Validating schema against the computed statistics for split train. INFO:absl:Validation complete for split train. Anomalies written to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/ExampleValidator/anomalies/4/train. INFO:absl:Validating schema against the computed statistics for split eval. INFO:absl:Validation complete for split eval. Anomalies written to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/ExampleValidator/anomalies/4/eval. INFO:absl:Running publisher for ExampleValidator INFO:absl:MetadataStore with DB connection initialized
After ExampleValidator finishes running, we can visualize the anomalies as a table.
context.show(example_validator.outputs['anomalies'])
In the anomalies table, we can see that there are no anomalies. This is what we'd expect, since this is the first dataset that we've analyzed and the schema is tailored to it. You should review this schema; anything unexpected means an anomaly in the data. Once reviewed, the schema can be used to guard future data, and anomalies produced here can be used to debug model performance, understand how your data evolves over time, and identify data errors.
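The core validation idea is simple to sketch in pure Python: compare observed values against the schema's expectations and report anything outside them. This toy domain check stands in for TFDV's much richer anomaly detection, and the schema domain and records below are hypothetical:

```python
def find_anomalies(records, domains):
    """Report values that fall outside a feature's schema domain."""
    anomalies = {}
    for record in records:
        for key, allowed in domains.items():
            value = record.get(key)
            if value not in allowed:
                anomalies.setdefault(key, []).append(value)
    return anomalies

# Hypothetical schema domain and records; the second record is anomalous.
domains = {'payment_type': {'Cash', 'Credit Card', 'Unknown'}}
records = [{'payment_type': 'Cash'}, {'payment_type': 'Bitcoin'}]
print(find_anomalies(records, domains))  # {'payment_type': ['Bitcoin']}
```

A value outside the recorded domain is exactly the kind of "unexpected" finding the anomalies table would surface on future data.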
Transform
The Transform component performs feature engineering for both training and serving. It uses the TensorFlow Transform library.
Transform will take as input the data from ExampleGen, the schema from SchemaGen, as well as a module that contains user-defined Transform code.
Let's see an example of user-defined Transform code below (for an introduction to the TensorFlow Transform APIs, see the tutorial). First, we define a few constants for feature engineering:
_taxi_constants_module_file = 'taxi_constants.py'
%%writefile {_taxi_constants_module_file}
# Categorical features are assumed to each have a maximum value in the dataset.
MAX_CATEGORICAL_FEATURE_VALUES = [24, 31, 12]
CATEGORICAL_FEATURE_KEYS = [
    'trip_start_hour', 'trip_start_day', 'trip_start_month',
    'pickup_census_tract', 'dropoff_census_tract', 'pickup_community_area',
    'dropoff_community_area'
]
DENSE_FLOAT_FEATURE_KEYS = ['trip_miles', 'fare', 'trip_seconds']
# Number of buckets used by tf.transform for encoding each feature.
FEATURE_BUCKET_COUNT = 10
BUCKET_FEATURE_KEYS = [
    'pickup_latitude', 'pickup_longitude', 'dropoff_latitude',
    'dropoff_longitude'
]
# Number of vocabulary terms used for encoding VOCAB_FEATURES by tf.transform
VOCAB_SIZE = 1000
# Count of out-of-vocab buckets in which unrecognized VOCAB_FEATURES are hashed.
OOV_SIZE = 10
VOCAB_FEATURE_KEYS = [
    'payment_type',
    'company',
]
# Keys
LABEL_KEY = 'tips'
FARE_KEY = 'fare'

def transformed_name(key):
  return key + '_xf'
Writing taxi_constants.py
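The transformed_name helper above just appends a suffix so that transformed features don't collide with the raw ones. For instance:

```python
# Mirrors the helper defined in taxi_constants.py above.
def transformed_name(key):
    return key + '_xf'

DENSE_FLOAT_FEATURE_KEYS = ['trip_miles', 'fare', 'trip_seconds']
print([transformed_name(k) for k in DENSE_FLOAT_FEATURE_KEYS])
# ['trip_miles_xf', 'fare_xf', 'trip_seconds_xf']
```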
Next, we write a preprocessing_fn that takes in raw data as input and returns transformed features that our model can train on:
_taxi_transform_module_file = 'taxi_transform.py'
%%writefile {_taxi_transform_module_file}
import tensorflow as tf
import tensorflow_transform as tft
import taxi_constants
_DENSE_FLOAT_FEATURE_KEYS = taxi_constants.DENSE_FLOAT_FEATURE_KEYS
_VOCAB_FEATURE_KEYS = taxi_constants.VOCAB_FEATURE_KEYS
_VOCAB_SIZE = taxi_constants.VOCAB_SIZE
_OOV_SIZE = taxi_constants.OOV_SIZE
_FEATURE_BUCKET_COUNT = taxi_constants.FEATURE_BUCKET_COUNT
_BUCKET_FEATURE_KEYS = taxi_constants.BUCKET_FEATURE_KEYS
_CATEGORICAL_FEATURE_KEYS = taxi_constants.CATEGORICAL_FEATURE_KEYS
_FARE_KEY = taxi_constants.FARE_KEY
_LABEL_KEY = taxi_constants.LABEL_KEY
_transformed_name = taxi_constants.transformed_name
def preprocessing_fn(inputs):
  """tf.transform's callback function for preprocessing inputs.

  Args:
    inputs: map from feature keys to raw not-yet-transformed features.

  Returns:
    Map from string feature key to transformed feature operations.
  """
  outputs = {}
  for key in _DENSE_FLOAT_FEATURE_KEYS:
    # Preserve this feature as a dense float, setting nan's to the mean.
    outputs[_transformed_name(key)] = tft.scale_to_z_score(
        _fill_in_missing(inputs[key]))

  for key in _VOCAB_FEATURE_KEYS:
    # Build a vocabulary for this feature.
    outputs[_transformed_name(key)] = tft.compute_and_apply_vocabulary(
        _fill_in_missing(inputs[key]),
        top_k=_VOCAB_SIZE,
        num_oov_buckets=_OOV_SIZE)

  for key in _BUCKET_FEATURE_KEYS:
    outputs[_transformed_name(key)] = tft.bucketize(
        _fill_in_missing(inputs[key]), _FEATURE_BUCKET_COUNT)

  for key in _CATEGORICAL_FEATURE_KEYS:
    outputs[_transformed_name(key)] = _fill_in_missing(inputs[key])

  # Was this passenger a big tipper?
  taxi_fare = _fill_in_missing(inputs[_FARE_KEY])
  tips = _fill_in_missing(inputs[_LABEL_KEY])
  outputs[_transformed_name(_LABEL_KEY)] = tf.where(
      tf.math.is_nan(taxi_fare),
      tf.cast(tf.zeros_like(taxi_fare), tf.int64),
      # Test if the tip was > 20% of the fare.
      tf.cast(
          tf.greater(tips, tf.multiply(taxi_fare, tf.constant(0.2))), tf.int64))

  return outputs


def _fill_in_missing(x):
  """Replace missing values in a SparseTensor.

  Fills in missing values of `x` with '' or 0, and converts to a dense tensor.

  Args:
    x: A `SparseTensor` of rank 2. Its dense shape should have size at most 1
      in the second dimension.

  Returns:
    A rank 1 tensor where missing values of `x` have been filled in.
  """
  default_value = '' if x.dtype == tf.string else 0
  return tf.squeeze(
      tf.sparse.to_dense(
          tf.SparseTensor(x.indices, x.values, [x.dense_shape[0], 1]),
          default_value),
      axis=1)
Writing taxi_transform.py
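The trickiest piece above is _fill_in_missing, which densifies a sparse column that holds at most one value per row. Its behavior can be illustrated without TensorFlow by a pure-Python analogue (a sketch of the semantics only, not the actual implementation):

```python
def fill_in_missing(indices, values, num_rows, default):
    """Pure-Python analogue of _fill_in_missing: each (row, col) index holds
    one value; rows with no entry are filled with the default."""
    dense = [default] * num_rows
    for (row, _col), value in zip(indices, values):
        dense[row] = value
    return dense

# Rows 0 and 2 have values; rows 1 and 3 are missing and get ''.
print(fill_in_missing([(0, 0), (2, 0)], ['Cash', 'Credit Card'], 4, ''))
# ['Cash', '', 'Credit Card', '']
```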
Now, we pass in this feature engineering code to the Transform component and run it to transform your data.
transform = Transform(
    examples=example_gen.outputs['examples'],
    schema=schema_gen.outputs['schema'],
    module_file=os.path.abspath(_taxi_transform_module_file))
context.run(transform)
INFO:absl:Running driver for Transform INFO:absl:MetadataStore with DB connection initialized INFO:absl:Running executor for Transform INFO:absl:Analyze the 'train' split and transform all splits when splits_config is not set. Warning:tensorflow:From /tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tfx/components/transform/executor.py:528: Schema (from tensorflow_transform.tf_metadata.dataset_schema) is deprecated and will be removed in a future version. Instructions for updating: Schema is a deprecated, use schema_utils.schema_from_feature_spec to create a `Schema` INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor. 
INFO:tensorflow:SavedModel written to: /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Transform/transform_graph/5/.temp_path/tftransform_tmp/58a3a95f8e3e4001b79596dca5b9fd7f/saved_model.pb
INFO:tensorflow:SavedModel written to: /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Transform/transform_graph/5/.temp_path/tftransform_tmp/2d933b65462c4a3a92c0fe939c572176/saved_model.pb
INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor. Warning:tensorflow:Tensorflow version (2.3.1) found. Note that Tensorflow Transform support for TF 2.0 is currently in beta, and features such as tf.function may not work as intended. 
Warning:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'> INFO:absl:Feature company has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_census_tract has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_community_area has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_latitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature dropoff_longitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature fare has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature payment_type has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_census_tract has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_community_area has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_latitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature pickup_longitude has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature tips has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_miles has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_seconds has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_day has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_hour has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_month has no shape. Setting to VarLenSparseTensor. INFO:absl:Feature trip_start_timestamp has no shape. Setting to VarLenSparseTensor. Warning:tensorflow:Tensorflow version (2.3.1) found. 
Note that Tensorflow Transform support for TF 2.0 is currently in beta, and features such as tf.function may not work as intended. Warning:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'> WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'> INFO:tensorflow:Saver not created because there are no variables in the graph to restore INFO:tensorflow:Saver not created because there are no variables in the graph to restore INFO:tensorflow:Assets added to graph. INFO:tensorflow:Assets written to: /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Transform/transform_graph/5/.temp_path/tftransform_tmp/81295dd0aa784768b7922ec85f54b0ce/assets INFO:tensorflow:SavedModel written to: /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Transform/transform_graph/5/.temp_path/tftransform_tmp/81295dd0aa784768b7922ec85f54b0ce/saved_model.pb WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef" value: "\n\013\n\tConst_2:0\022-vocab_compute_and_apply_vocabulary_vocabulary" Warning:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef" value: "\n\013\n\tConst_4:0\022/vocab_compute_and_apply_vocabulary_1_vocabulary" INFO:tensorflow:Saver not created because there are no variables in the graph to restore WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef" value: "\n\013\n\tConst_2:0\022-vocab_compute_and_apply_vocabulary_vocabulary" Warning:tensorflow:Expected binary or unicode string, got type_url: 
"type.googleapis.com/tensorflow.AssetFileDef" value: "\n\013\n\tConst_4:0\022/vocab_compute_and_apply_vocabulary_1_vocabulary" INFO:tensorflow:Saver not created because there are no variables in the graph to restore WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef" value: "\n\013\n\tConst_2:0\022-vocab_compute_and_apply_vocabulary_vocabulary" Warning:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef" value: "\n\013\n\tConst_4:0\022/vocab_compute_and_apply_vocabulary_1_vocabulary" INFO:tensorflow:Saver not created because there are no variables in the graph to restore INFO:absl:Running publisher for Transform INFO:absl:MetadataStore with DB connection initialized
Let's examine the output artifacts of Transform. This component produces two types of output:

- transform_graph is the graph that can perform the preprocessing operations (this graph will be included in the serving and evaluation models).
- transformed_examples represents the preprocessed training and evaluation data.
transform.outputs
{'transform_graph': Channel( type_name: TransformGraph artifacts: [Artifact(artifact: id: 5 type_id: 13 uri: "/tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Transform/transform_graph/5" custom_properties { key: "name" value { string_value: "transform_graph" } } custom_properties { key: "producer_component" value { string_value: "Transform" } } custom_properties { key: "state" value { string_value: "published" } } state: LIVE , artifact_type: id: 13 name: "TransformGraph" )] ), 'transformed_examples': Channel( type_name: Examples artifacts: [Artifact(artifact: id: 6 type_id: 5 uri: "/tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Transform/transformed_examples/5" properties { key: "split_names" value { string_value: "[\"train\", \"eval\"]" } } custom_properties { key: "name" value { string_value: "transformed_examples" } } custom_properties { key: "producer_component" value { string_value: "Transform" } } custom_properties { key: "state" value { string_value: "published" } } state: LIVE , artifact_type: id: 5 name: "Examples" properties { key: "span" value: INT } properties { key: "split_names" value: STRING } properties { key: "version" value: INT } )] ), 'updated_analyzer_cache': Channel( type_name: TransformCache artifacts: [Artifact(artifact: id: 7 type_id: 14 uri: "/tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Transform/updated_analyzer_cache/5" custom_properties { key: "name" value { string_value: "updated_analyzer_cache" } } custom_properties { key: "producer_component" value { string_value: "Transform" } } custom_properties { key: "state" value { string_value: "published" } } state: LIVE , artifact_type: id: 14 name: "TransformCache" )] )}
Take a peek at the transform_graph artifact. It points to a directory containing three subdirectories.
train_uri = transform.outputs['transform_graph'].get()[0].uri
os.listdir(train_uri)
['transform_fn', 'transformed_metadata', 'metadata']
The transformed_metadata subdirectory contains the schema of the preprocessed data. The transform_fn subdirectory contains the actual preprocessing graph. The metadata subdirectory contains the schema of the original data.
We can also take a look at the first three transformed examples:
# Get the URI of the output artifact representing the transformed examples, which is a directory
train_uri = os.path.join(transform.outputs['transformed_examples'].get()[0].uri, 'train')

# Get the list of files in this directory (all compressed TFRecord files)
tfrecord_filenames = [os.path.join(train_uri, name)
                      for name in os.listdir(train_uri)]

# Create a `TFRecordDataset` to read these files
dataset = tf.data.TFRecordDataset(tfrecord_filenames, compression_type="GZIP")

# Iterate over the first 3 records and decode them.
for tfrecord in dataset.take(3):
  serialized_example = tfrecord.numpy()
  example = tf.train.Example()
  example.ParseFromString(serialized_example)
  pp.pprint(example)
features { feature { key: "company_xf" value { int64_list { value: 8 } } } feature { key: "dropoff_census_tract_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_community_area_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_latitude_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_longitude_xf" value { int64_list { value: 9 } } } feature { key: "fare_xf" value { float_list { value: 0.06106060370802879 } } } feature { key: "payment_type_xf" value { int64_list { value: 1 } } } feature { key: "pickup_census_tract_xf" value { int64_list { value: 0 } } } feature { key: "pickup_community_area_xf" value { int64_list { value: 0 } } } feature { key: "pickup_latitude_xf" value { int64_list { value: 0 } } } feature { key: "pickup_longitude_xf" value { int64_list { value: 9 } } } feature { key: "tips_xf" value { int64_list { value: 0 } } } feature { key: "trip_miles_xf" value { float_list { value: -0.15886740386486053 } } } feature { key: "trip_seconds_xf" value { float_list { value: -0.7118486762046814 } } } feature { key: "trip_start_day_xf" value { int64_list { value: 6 } } } feature { key: "trip_start_hour_xf" value { int64_list { value: 19 } } } feature { key: "trip_start_month_xf" value { int64_list { value: 5 } } } } features { feature { key: "company_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_census_tract_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_community_area_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_latitude_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_longitude_xf" value { int64_list { value: 9 } } } feature { key: "fare_xf" value { float_list { value: 1.2521241903305054 } } } feature { key: "payment_type_xf" value { int64_list { value: 0 } } } feature { key: "pickup_census_tract_xf" value { int64_list { value: 0 } } } feature { key: "pickup_community_area_xf" value { int64_list { value: 60 } } } feature { key: "pickup_latitude_xf" value 
{ int64_list { value: 0 } } } feature { key: "pickup_longitude_xf" value { int64_list { value: 3 } } } feature { key: "tips_xf" value { int64_list { value: 0 } } } feature { key: "trip_miles_xf" value { float_list { value: 0.532160758972168 } } } feature { key: "trip_seconds_xf" value { float_list { value: 0.5509493947029114 } } } feature { key: "trip_start_day_xf" value { int64_list { value: 3 } } } feature { key: "trip_start_hour_xf" value { int64_list { value: 2 } } } feature { key: "trip_start_month_xf" value { int64_list { value: 10 } } } } features { feature { key: "company_xf" value { int64_list { value: 48 } } } feature { key: "dropoff_census_tract_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_community_area_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_latitude_xf" value { int64_list { value: 0 } } } feature { key: "dropoff_longitude_xf" value { int64_list { value: 9 } } } feature { key: "fare_xf" value { float_list { value: 0.3873794972896576 } } } feature { key: "payment_type_xf" value { int64_list { value: 0 } } } feature { key: "pickup_census_tract_xf" value { int64_list { value: 0 } } } feature { key: "pickup_community_area_xf" value { int64_list { value: 13 } } } feature { key: "pickup_latitude_xf" value { int64_list { value: 9 } } } feature { key: "pickup_longitude_xf" value { int64_list { value: 0 } } } feature { key: "tips_xf" value { int64_list { value: 0 } } } feature { key: "trip_miles_xf" value { float_list { value: 0.21955278515815735 } } } feature { key: "trip_seconds_xf" value { float_list { value: 0.0019067703979089856 } } } feature { key: "trip_start_day_xf" value { int64_list { value: 3 } } } feature { key: "trip_start_hour_xf" value { int64_list { value: 12 } } } feature { key: "trip_start_month_xf" value { int64_list { value: 11 } } } }
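Each printed record above is a tf.train.Example proto, which nests every value inside a feature map keyed by kind (int64_list or float_list). As a quick standalone illustration of that structure (a hypothetical helper, not part of TFX; the feature names and values are copied from the first record above), we can flatten one record into a plain Python dict:

```python
# Hypothetical helper: flatten the {feature -> {kind -> [values]}} structure
# seen in the printed tf.train.Example protos into {feature -> scalar}.
def flatten_example(features):
    flat = {}
    for name, kinds in features.items():
        for kind, values in kinds.items():  # kind is 'int64_list' or 'float_list'
            # Single-valued features become scalars; multi-valued stay as lists.
            flat[name] = values[0] if len(values) == 1 else values
    return flat

# A few features copied from the first transformed record shown above.
first_record = {
    'company_xf': {'int64_list': [8]},
    'payment_type_xf': {'int64_list': [1]},
    'fare_xf': {'float_list': [0.06106060370802879]},
    'trip_start_hour_xf': {'int64_list': [19]},
}
print(flatten_example(first_record))
```

Note how the `_xf`-suffixed names mark features produced by the preprocessing graph, and how categorical features (company, payment_type) have already been vocabulary-encoded to integers while numeric features (fare) have been scaled.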
After the Transform component has converted your data into features, the next step is to train a model.
Trainer
The Trainer component trains a model that you define in TensorFlow. By default, Trainer supports the Estimator API; to use the Keras API instead, you need to specify the Generic Trainer by passing custom_executor_spec=executor_spec.ExecutorClassSpec(GenericExecutor) in Trainer's constructor.
Trainer takes as input the schema from SchemaGen, the transformed data and graph from Transform, training parameters, and a module containing user-defined model code.
Let's look at an example of user-defined model code below (for an introduction to the TensorFlow Keras APIs, see the tutorial):
_taxi_trainer_module_file = 'taxi_trainer.py'
%%writefile {_taxi_trainer_module_file}
from typing import List, Text
import os
import absl
import datetime
import tensorflow as tf
import tensorflow_transform as tft
from tfx.components.trainer.executor import TrainerFnArgs
from tfx.components.trainer.fn_args_utils import DataAccessor
from tfx_bsl.tfxio import dataset_options
import taxi_constants
_DENSE_FLOAT_FEATURE_KEYS = taxi_constants.DENSE_FLOAT_FEATURE_KEYS
_VOCAB_FEATURE_KEYS = taxi_constants.VOCAB_FEATURE_KEYS
_VOCAB_SIZE = taxi_constants.VOCAB_SIZE
_OOV_SIZE = taxi_constants.OOV_SIZE
_FEATURE_BUCKET_COUNT = taxi_constants.FEATURE_BUCKET_COUNT
_BUCKET_FEATURE_KEYS = taxi_constants.BUCKET_FEATURE_KEYS
_CATEGORICAL_FEATURE_KEYS = taxi_constants.CATEGORICAL_FEATURE_KEYS
_MAX_CATEGORICAL_FEATURE_VALUES = taxi_constants.MAX_CATEGORICAL_FEATURE_VALUES
_LABEL_KEY = taxi_constants.LABEL_KEY
_transformed_name = taxi_constants.transformed_name
def _transformed_names(keys):
  return [_transformed_name(key) for key in keys]
def _get_serve_tf_examples_fn(model, tf_transform_output):
  """Returns a function that parses a serialized tf.Example and applies TFT."""

  model.tft_layer = tf_transform_output.transform_features_layer()

  @tf.function
  def serve_tf_examples_fn(serialized_tf_examples):
    """Returns the output to be used in the serving signature."""
    feature_spec = tf_transform_output.raw_feature_spec()
    feature_spec.pop(_LABEL_KEY)
    parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)
    transformed_features = model.tft_layer(parsed_features)
    return model(transformed_features)

  return serve_tf_examples_fn
def _input_fn(file_pattern: List[Text],
              data_accessor: DataAccessor,
              tf_transform_output: tft.TFTransformOutput,
              batch_size: int = 200) -> tf.data.Dataset:
  """Generates features and label for tuning/training.

  Args:
    file_pattern: List of paths or patterns of input tfrecord files.
    data_accessor: DataAccessor for converting input to RecordBatch.
    tf_transform_output: A TFTransformOutput.
    batch_size: representing the number of consecutive elements of returned
      dataset to combine in a single batch

  Returns:
    A dataset that contains (features, indices) tuple where features is a
      dictionary of Tensors, and indices is a single Tensor of label indices.
  """
  return data_accessor.tf_dataset_factory(
      file_pattern,
      dataset_options.TensorFlowDatasetOptions(
          batch_size=batch_size, label_key=_transformed_name(_LABEL_KEY)),
      tf_transform_output.transformed_metadata.schema)
def _build_keras_model(hidden_units: List[int] = None) -> tf.keras.Model:
  """Creates a DNN Keras model for classifying taxi data.

  Args:
    hidden_units: [int], the layer sizes of the DNN (input layer first).

  Returns:
    A keras Model.
  """
  real_valued_columns = [
      tf.feature_column.numeric_column(key, shape=())
      for key in _transformed_names(_DENSE_FLOAT_FEATURE_KEYS)
  ]
  categorical_columns = [
      tf.feature_column.categorical_column_with_identity(
          key, num_buckets=_VOCAB_SIZE + _OOV_SIZE, default_value=0)
      for key in _transformed_names(_VOCAB_FEATURE_KEYS)
  ]
  categorical_columns += [
      tf.feature_column.categorical_column_with_identity(
          key, num_buckets=_FEATURE_BUCKET_COUNT, default_value=0)
      for key in _transformed_names(_BUCKET_FEATURE_KEYS)
  ]
  categorical_columns += [
      tf.feature_column.categorical_column_with_identity(  # pylint: disable=g-complex-comprehension
          key,
          num_buckets=num_buckets,
          default_value=0) for key, num_buckets in zip(
              _transformed_names(_CATEGORICAL_FEATURE_KEYS),
              _MAX_CATEGORICAL_FEATURE_VALUES)
  ]
  indicator_column = [
      tf.feature_column.indicator_column(categorical_column)
      for categorical_column in categorical_columns
  ]

  model = _wide_and_deep_classifier(
      # TODO(b/139668410) replace with premade wide_and_deep keras model
      wide_columns=indicator_column,
      deep_columns=real_valued_columns,
      dnn_hidden_units=hidden_units or [100, 70, 50, 25])
  return model
def _wide_and_deep_classifier(wide_columns, deep_columns, dnn_hidden_units):
  """Build a simple keras wide and deep model.

  Args:
    wide_columns: Feature columns wrapped in indicator_column for wide (linear)
      part of the model.
    deep_columns: Feature columns for deep part of the model.
    dnn_hidden_units: [int], the layer sizes of the hidden DNN.

  Returns:
    A Wide and Deep Keras model
  """
  # The following values are hard-coded for simplicity in this example;
  # however, they should preferably be passed in as hparams.

  # Keras needs the feature definitions at compile time.
  # TODO(b/139081439): Automate generation of input layers from FeatureColumn.
  input_layers = {
      colname: tf.keras.layers.Input(name=colname, shape=(), dtype=tf.float32)
      for colname in _transformed_names(_DENSE_FLOAT_FEATURE_KEYS)
  }
  input_layers.update({
      colname: tf.keras.layers.Input(name=colname, shape=(), dtype='int32')
      for colname in _transformed_names(_VOCAB_FEATURE_KEYS)
  })
  input_layers.update({
      colname: tf.keras.layers.Input(name=colname, shape=(), dtype='int32')
      for colname in _transformed_names(_BUCKET_FEATURE_KEYS)
  })
  input_layers.update({
      colname: tf.keras.layers.Input(name=colname, shape=(), dtype='int32')
      for colname in _transformed_names(_CATEGORICAL_FEATURE_KEYS)
  })

  # TODO(b/161952382): Replace with Keras preprocessing layers.
  deep = tf.keras.layers.DenseFeatures(deep_columns)(input_layers)
  for numnodes in dnn_hidden_units:
    deep = tf.keras.layers.Dense(numnodes)(deep)
  wide = tf.keras.layers.DenseFeatures(wide_columns)(input_layers)

  output = tf.keras.layers.Dense(
      1, activation='sigmoid')(
          tf.keras.layers.concatenate([deep, wide]))

  model = tf.keras.Model(input_layers, output)
  model.compile(
      loss='binary_crossentropy',
      optimizer=tf.keras.optimizers.Adam(lr=0.001),
      metrics=[tf.keras.metrics.BinaryAccuracy()])
  model.summary(print_fn=absl.logging.info)
  return model
# TFX Trainer will call this function.
def run_fn(fn_args: TrainerFnArgs):
  """Train the model based on given args.

  Args:
    fn_args: Holds args used to train the model as name/value pairs.
  """
  # Number of nodes in the first layer of the DNN
  first_dnn_layer_size = 100
  num_dnn_layers = 4
  dnn_decay_factor = 0.7

  tf_transform_output = tft.TFTransformOutput(fn_args.transform_output)

  train_dataset = _input_fn(fn_args.train_files, fn_args.data_accessor,
                            tf_transform_output, 40)
  eval_dataset = _input_fn(fn_args.eval_files, fn_args.data_accessor,
                           tf_transform_output, 40)

  model = _build_keras_model(
      # Construct layer sizes with exponential decay
      hidden_units=[
          max(2, int(first_dnn_layer_size * dnn_decay_factor**i))
          for i in range(num_dnn_layers)
      ])

  tensorboard_callback = tf.keras.callbacks.TensorBoard(
      log_dir=fn_args.model_run_dir, update_freq='batch')
  model.fit(
      train_dataset,
      steps_per_epoch=fn_args.train_steps,
      validation_data=eval_dataset,
      validation_steps=fn_args.eval_steps,
      callbacks=[tensorboard_callback])

  signatures = {
      'serving_default':
          _get_serve_tf_examples_fn(model,
                                    tf_transform_output).get_concrete_function(
                                        tf.TensorSpec(
                                            shape=[None],
                                            dtype=tf.string,
                                            name='examples')),
  }
  model.save(fn_args.serving_model_dir, save_format='tf', signatures=signatures)
Writing taxi_trainer.py
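The hidden_units passed to _build_keras_model in run_fn follow an exponential decay schedule (first layer of 100 nodes, decay factor 0.7, four layers). As a quick standalone check of what that comprehension produces:

```python
# Reproduce the hidden_units computation from run_fn above.
first_dnn_layer_size = 100
num_dnn_layers = 4
dnn_decay_factor = 0.7

hidden_units = [
    max(2, int(first_dnn_layer_size * dnn_decay_factor**i))  # 100 * 0.7^i, floored, min 2
    for i in range(num_dnn_layers)
]
print(hidden_units)  # [100, 70, 48, 34]
```

This is why the trained model ends up with Dense layers of 100, 70, 48, and 34 units rather than the [100, 70, 50, 25] default hard-coded in _build_keras_model.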
Now we pass this model code to the Trainer component and run it to train the model.
trainer = Trainer(
    module_file=os.path.abspath(_taxi_trainer_module_file),
    custom_executor_spec=executor_spec.ExecutorClassSpec(GenericExecutor),
    examples=transform.outputs['transformed_examples'],
    transform_graph=transform.outputs['transform_graph'],
    schema=schema_gen.outputs['schema'],
    train_args=trainer_pb2.TrainArgs(num_steps=10000),
    eval_args=trainer_pb2.EvalArgs(num_steps=5000))
context.run(trainer)
WARNING:tensorflow:From <ipython-input-1-47ec22ebd3e4>:3: The name tfx.components.base.executor_spec.ExecutorClassSpec is deprecated. Please use tfx.dsl.components.base.executor_spec.ExecutorClassSpec instead. INFO:absl:Running driver for Trainer INFO:absl:MetadataStore with DB connection initialized INFO:absl:Running executor for Trainer INFO:absl:Train on the 'train' split when train_args.splits is not set. INFO:absl:Evaluate on the 'eval' split when eval_args.splits is not set. WARNING:absl:Examples artifact does not have payload_format custom property. Falling back to FORMAT_TF_EXAMPLE WARNING:absl:Examples artifact does not have payload_format custom property. Falling back to FORMAT_TF_EXAMPLE INFO:absl:Training model. INFO:absl:Feature company_xf has a shape . Setting to DenseTensor. INFO:absl:Feature dropoff_census_tract_xf has a shape . Setting to DenseTensor. INFO:absl:Feature dropoff_community_area_xf has a shape . Setting to DenseTensor. INFO:absl:Feature dropoff_latitude_xf has a shape . Setting to DenseTensor. INFO:absl:Feature dropoff_longitude_xf has a shape . Setting to DenseTensor. INFO:absl:Feature fare_xf has a shape . Setting to DenseTensor. INFO:absl:Feature payment_type_xf has a shape . Setting to DenseTensor. INFO:absl:Feature pickup_census_tract_xf has a shape . Setting to DenseTensor. INFO:absl:Feature pickup_community_area_xf has a shape . Setting to DenseTensor. INFO:absl:Feature pickup_latitude_xf has a shape . Setting to DenseTensor. INFO:absl:Feature pickup_longitude_xf has a shape . Setting to DenseTensor. INFO:absl:Feature tips_xf has a shape . Setting to DenseTensor. INFO:absl:Feature trip_miles_xf has a shape . Setting to DenseTensor. INFO:absl:Feature trip_seconds_xf has a shape . Setting to DenseTensor. INFO:absl:Feature trip_start_day_xf has a shape . Setting to DenseTensor. INFO:absl:Feature trip_start_hour_xf has a shape . Setting to DenseTensor. INFO:absl:Feature trip_start_month_xf has a shape . Setting to DenseTensor. 
INFO:absl:Model: "functional_1"
INFO:absl:Layer (type)                          Output Shape   Param #   Connected to
INFO:absl:===================================================================================
INFO:absl:company_xf (InputLayer)               [(None,)]      0
INFO:absl:dropoff_census_tract_xf (InputLayer)  [(None,)]      0
INFO:absl:dropoff_community_area_xf (InputLayer)[(None,)]      0
INFO:absl:dropoff_latitude_xf (InputLayer)      [(None,)]      0
INFO:absl:dropoff_longitude_xf (InputLayer)     [(None,)]      0
INFO:absl:fare_xf (InputLayer)                  [(None,)]      0
INFO:absl:payment_type_xf (InputLayer)          [(None,)]      0
INFO:absl:pickup_census_tract_xf (InputLayer)   [(None,)]      0
INFO:absl:pickup_community_area_xf (InputLayer) [(None,)]      0
INFO:absl:pickup_latitude_xf (InputLayer)       [(None,)]      0
INFO:absl:pickup_longitude_xf (InputLayer)      [(None,)]      0
INFO:absl:trip_miles_xf (InputLayer)            [(None,)]      0
INFO:absl:trip_seconds_xf (InputLayer)          [(None,)]      0
INFO:absl:trip_start_day_xf (InputLayer)        [(None,)]      0
INFO:absl:trip_start_hour_xf (InputLayer)       [(None,)]      0
INFO:absl:trip_start_month_xf (InputLayer)      [(None,)]      0
INFO:absl:dense_features (DenseFeatures)        (None, 3)      0         all 16 input layers
INFO:absl:dense (Dense)                         (None, 100)    400       dense_features[0][0]
INFO:absl:dense_1 (Dense)                       (None, 70)     7070      dense[0][0]
INFO:absl:dense_2 (Dense)                       (None, 48)     3408      dense_1[0][0]
INFO:absl:dense_3 (Dense)                       (None, 34)     1666      dense_2[0][0]
INFO:absl:dense_features_1 (DenseFeatures)      (None, 2127)   0         all 16 input layers
INFO:absl:concatenate (Concatenate)             (None, 2161)   0         dense_3[0][0], dense_features_1[0][0]
INFO:absl:dense_4 (Dense)                       (None, 1)      2162      concatenate[0][0]
INFO:absl:===================================================================================
INFO:absl:Total params: 14,706
INFO:absl:Trainable params: 14,706
INFO:absl:Non-trainable params: 0
1/10000 [..............................] - ETA: 3s - loss: 0.6688 - binary_accuracy: 0.7000
WARNING:tensorflow:From /tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/ops/summary_ops_v2.py:1277: stop (from tensorflow.python.eager.profiler) is deprecated and will be removed after 2020-07-01. Instructions for updating: use `tf.profiler.experimental.stop` instead.
WARNING:tensorflow:Callbacks method `on_train_batch_end` is slow compared to the batch time (batch time: 0.0060s vs `on_train_batch_end` time: 0.0244s). Check your callbacks. 10000/10000 [==============================] - 77s 8ms/step - loss: 0.2371 - binary_accuracy: 0.8819 - val_loss: 0.2219 - val_binary_accuracy: 0.8820 INFO:tensorflow:Saver not created because there are no variables in the graph to restore WARNING:tensorflow:From /tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/training/tracking/tracking.py:111: Model.state_updates (from tensorflow.python.keras.engine.training) is deprecated and will be removed in a future version. Instructions for updating: This property should not be used in TensorFlow 2.0, as updates are applied automatically. WARNING:tensorflow:From /tmpfs/src/tf_docs_env/lib/python3.6/site-packages/tensorflow/python/training/tracking/tracking.py:111: Layer.updates (from tensorflow.python.keras.engine.base_layer) is deprecated and will be removed in a future version. Instructions for updating: This property should not be used in TensorFlow 2.0, as updates are applied automatically. INFO:tensorflow:Assets written to: /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Trainer/model/6/serving_model_dir/assets INFO:absl:Training complete. Model written to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Trainer/model/6/serving_model_dir. ModelRun written to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Trainer/model_run/6 INFO:absl:Running publisher for Trainer INFO:absl:MetadataStore with DB connection initialized
Analyze Training with TensorBoard

Take a peek at the trainer artifact. It points at a directory containing the model subdirectories.
model_artifact_dir = trainer.outputs['model'].get()[0].uri
pp.pprint(os.listdir(model_artifact_dir))
model_dir = os.path.join(model_artifact_dir, 'serving_model_dir')
pp.pprint(os.listdir(model_dir))
['serving_model_dir'] ['variables', 'assets', 'saved_model.pb']
Optionally, we can connect TensorBoard to the Trainer to analyze our model's training curves.
model_run_artifact_dir = trainer.outputs['model_run'].get()[0].uri
%load_ext tensorboard
%tensorboard --logdir {model_run_artifact_dir}
Evaluator

The Evaluator component computes model performance metrics over the evaluation set, using the TensorFlow Model Analysis library. Evaluator can also optionally validate that a newly trained model is better than the previous model. This is useful in a production pipeline setting where you may automatically train and validate a model every day. Since we only train one model in this notebook, the Evaluator will automatically label the model as "good".

Evaluator takes as input the data from ExampleGen, the trained model from Trainer, and a slicing configuration. The slicing configuration allows you to slice your metrics on feature values (for example: how does your model perform on taxi trips that start between 8am and 8pm?). See an example of this configuration below:
eval_config = tfma.EvalConfig(
model_specs=[
# This assumes a serving model with signature 'serving_default'. If
# using estimator based EvalSavedModel, add signature_name: 'eval' and
# remove the label_key.
tfma.ModelSpec(label_key='tips')
],
metrics_specs=[
tfma.MetricsSpec(
# The metrics added here are in addition to those saved with the
# model (assuming either a keras model or EvalSavedModel is used).
# Any metrics added into the saved model (for example using
# model.compile(..., metrics=[...]), etc) will be computed
# automatically.
# To add validation thresholds for metrics saved with the model,
# add them keyed by metric name to the thresholds map.
metrics=[
tfma.MetricConfig(class_name='ExampleCount'),
tfma.MetricConfig(class_name='BinaryAccuracy',
threshold=tfma.MetricThreshold(
value_threshold=tfma.GenericValueThreshold(
lower_bound={'value': 0.5}),
change_threshold=tfma.GenericChangeThreshold(
direction=tfma.MetricDirection.HIGHER_IS_BETTER,
absolute={'value': -1e-10})))
]
)
],
slicing_specs=[
# An empty slice spec means the overall slice, i.e. the whole dataset.
tfma.SlicingSpec(),
# Data can be sliced along a feature column. In this case, data is
# sliced along feature column trip_start_hour.
tfma.SlicingSpec(feature_keys=['trip_start_hour'])
])
We then give this configuration to Evaluator and run it.
# Use TFMA to compute evaluation statistics over features of a model and
# validate them against a baseline.
# The model resolver is only required if performing model validation in addition
# to evaluation. In this case we validate against the latest blessed model. If
# no model has been blessed before (as in this case) the evaluator will make our
# candidate the first blessed model.
model_resolver = ResolverNode(
instance_name='latest_blessed_model_resolver',
resolver_class=latest_blessed_model_resolver.LatestBlessedModelResolver,
model=Channel(type=Model),
model_blessing=Channel(type=ModelBlessing))
context.run(model_resolver)
evaluator = Evaluator(
examples=example_gen.outputs['examples'],
model=trainer.outputs['model'],
baseline_model=model_resolver.outputs['model'],
# Change threshold will be ignored if there is no baseline (first run).
eval_config=eval_config)
context.run(evaluator)
WARNING:absl:`instance_name` is deprecated, please set node id directly using `with_id()` or `.id` setter.
INFO:absl:Running driver for ResolverNode.latest_blessed_model_resolver
INFO:absl:Running publisher for ResolverNode.latest_blessed_model_resolver
INFO:absl:Running driver for Evaluator
INFO:absl:Running executor for Evaluator
WARNING:absl:"maybe_add_baseline" and "maybe_remove_baseline" are deprecated, please use "has_baseline" instead.
INFO:absl:Request was made to ignore the baseline ModelSpec and any change thresholds. This is likely because a baseline model was not provided.
INFO:absl:Using /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Trainer/model/6/serving_model_dir as model.
INFO:absl:The 'example_splits' parameter is not set, using 'eval' split.
INFO:absl:Evaluating model.
INFO:absl:Evaluation complete. Results written to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Evaluator/evaluation/8.
INFO:absl:Checking validation results.
INFO:absl:Blessing result True written to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Evaluator/blessing/8.
INFO:absl:Running publisher for Evaluator
INFO:absl:MetadataStore with DB connection initialized
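The ResolverNode above asks ML Metadata for the most recent model that was blessed by a previous Evaluator run. As an illustrative plain-Python sketch (not the MLMD-backed implementation), the resolution logic amounts to scanning published models from newest to oldest; the function name and the list-of-tuples shape are assumptions for illustration.

```python
def latest_blessed(models):
    """models: list of (model_id, blessed) pairs in publication order.
    Returns the newest blessed model_id, or None on the first run."""
    for model_id, blessed in reversed(models):
        if blessed:
            return model_id
    return None

# First pipeline run: nothing published yet, so there is no baseline and
# change thresholds are ignored.
print(latest_blessed([]))                       # None
# Later: model 4 was blessed, model 6 was not, so 4 stays the baseline.
print(latest_blessed([(4, True), (6, False)]))  # 4
```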
Now let's examine the output artifacts of Evaluator.
evaluator.outputs
{'evaluation': Channel( type_name: ModelEvaluation artifacts: [Artifact(artifact: id: 10 type_id: 20 uri: "/tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Evaluator/evaluation/8" custom_properties { key: "name" value { string_value: "evaluation" } } custom_properties { key: "producer_component" value { string_value: "Evaluator" } } custom_properties { key: "state" value { string_value: "published" } } state: LIVE , artifact_type: id: 20 name: "ModelEvaluation" )] ), 'blessing': Channel( type_name: ModelBlessing artifacts: [Artifact(artifact: id: 11 type_id: 21 uri: "/tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Evaluator/blessing/8" custom_properties { key: "blessed" value { int_value: 1 } } custom_properties { key: "current_model" value { string_value: "/tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Trainer/model/6" } } custom_properties { key: "current_model_id" value { int_value: 8 } } custom_properties { key: "name" value { string_value: "blessing" } } custom_properties { key: "producer_component" value { string_value: "Evaluator" } } custom_properties { key: "state" value { string_value: "published" } } state: LIVE , artifact_type: id: 21 name: "ModelBlessing" )] )}
Using the evaluation output we can show the default visualization of global metrics over the entire evaluation set.
context.show(evaluator.outputs['evaluation'])
To see the visualization for sliced evaluation metrics, we can call the TensorFlow Model Analysis library directly.
import tensorflow_model_analysis as tfma
# Get the TFMA output result path and load the result.
PATH_TO_RESULT = evaluator.outputs['evaluation'].get()[0].uri
tfma_result = tfma.load_eval_result(PATH_TO_RESULT)
# Show data sliced along feature column trip_start_hour.
tfma.view.render_slicing_metrics(
tfma_result, slicing_column='trip_start_hour')
SlicingMetricsViewer(config={'weightedExamplesColumn': 'example_count'}, data=[{'slice': 'trip_start_hour:19',…
This visualization shows the same metrics, but computed at every feature value of trip_start_hour instead of over the whole evaluation set.
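Conceptually, slicing just groups evaluation records by a feature value and computes the metric within each group. Here is a minimal plain-Python sketch (not TFMA) of that idea; the record layout and the 0.5 decision threshold are assumptions for illustration.

```python
from collections import defaultdict

def sliced_accuracy(examples, slice_key):
    """Group (features, label, prediction) records by a feature value and
    compute binary accuracy within each slice."""
    hits = defaultdict(int)
    counts = defaultdict(int)
    for features, label, prediction in examples:
        key = features[slice_key]
        counts[key] += 1
        hits[key] += int((prediction >= 0.5) == bool(label))
    return {k: hits[k] / counts[k] for k in counts}

# Hypothetical records: ({features}, label, predicted probability).
records = [
    ({'trip_start_hour': 19}, 1, 0.9),
    ({'trip_start_hour': 19}, 0, 0.2),
    ({'trip_start_hour': 2},  1, 0.3),
]
print(sliced_accuracy(records, 'trip_start_hour'))  # {19: 1.0, 2: 0.0}
```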
TensorFlow Model Analysis supports many other visualizations, such as Fairness Indicators and plotting a time series of model performance. To learn more, see the TFMA tutorial.
Since we added thresholds to our config, validation output is also available. The presence of a blessing artifact indicates that our model passed validation. Since this is the first validation being performed, the candidate is automatically blessed.
blessing_uri = evaluator.outputs['blessing'].get()[0].uri
!ls -l {blessing_uri}
total 0 -rw-rw-r-- 1 kbuilder kbuilder 0 Nov 26 10:20 BLESSED
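The ls output above shows that the blessing artifact is just a zero-byte BLESSED marker file. As an illustrative sketch (the marker-file convention is inferred from that listing, not from the Pusher source), a downstream gate can be as simple as checking for that file:

```python
import os
import tempfile

def is_blessed(blessing_dir):
    """Return True if the blessing artifact directory contains a BLESSED marker."""
    return os.path.exists(os.path.join(blessing_dir, 'BLESSED'))

# Simulate the artifact layout shown above with a temporary directory.
with tempfile.TemporaryDirectory() as blessing_uri:
    open(os.path.join(blessing_uri, 'BLESSED'), 'w').close()  # zero-byte marker
    print(is_blessed(blessing_uri))  # True
```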
We can also verify the success by loading the validation result record:
PATH_TO_RESULT = evaluator.outputs['evaluation'].get()[0].uri
print(tfma.load_validation_result(PATH_TO_RESULT))
validation_ok: true validation_details { slicing_details { slicing_spec { } num_matching_slices: 25 } }
Pusher

The Pusher component is usually at the end of a TFX pipeline. It checks whether a model has passed validation, and if so, pushes the model to _serving_model_dir.
pusher = Pusher(
model=trainer.outputs['model'],
model_blessing=evaluator.outputs['blessing'],
push_destination=pusher_pb2.PushDestination(
filesystem=pusher_pb2.PushDestination.Filesystem(
base_directory=_serving_model_dir)))
context.run(pusher)
INFO:absl:Running driver for Pusher
INFO:absl:MetadataStore with DB connection initialized
INFO:absl:Running executor for Pusher
INFO:absl:Model version: 1606386023
INFO:absl:Model written to serving path /tmp/tmpogu7x3rf/serving_model/taxi_simple/1606386023.
INFO:absl:Model pushed to /tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Pusher/pushed_model/9.
INFO:absl:Running publisher for Pusher
INFO:absl:MetadataStore with DB connection initialized
Let's examine the output artifacts of Pusher.
pusher.outputs
{'pushed_model': Channel( type_name: PushedModel artifacts: [Artifact(artifact: id: 12 type_id: 23 uri: "/tmp/tfx-interactive-2020-11-26T10_18_19.401905-axkggyw7/Pusher/pushed_model/9" custom_properties { key: "name" value { string_value: "pushed_model" } } custom_properties { key: "producer_component" value { string_value: "Pusher" } } custom_properties { key: "pushed" value { int_value: 1 } } custom_properties { key: "pushed_destination" value { string_value: "/tmp/tmpogu7x3rf/serving_model/taxi_simple/1606386023" } } custom_properties { key: "pushed_version" value { string_value: "1606386023" } } custom_properties { key: "state" value { string_value: "published" } } state: LIVE , artifact_type: id: 23 name: "PushedModel" )] )}
In particular, Pusher exports your model in the SavedModel format, which looks like this:
push_uri = pusher.outputs['pushed_model'].get()[0].uri
model = tf.saved_model.load(push_uri)
for item in model.signatures.items():
pp.pprint(item)
('serving_default', <ConcreteFunction signature_wrapper(examples) at 0x7F165C066A90>)
We have finished our tour of the built-in TFX components!