Context manager for tensorflow-transform.
tft_beam.Context(
    temp_dir: Optional[str] = None,
    desired_batch_size: Optional[int] = None,
    passthrough_keys: Optional[Iterable[str]] = None,
    use_deep_copy_optimization: Optional[bool] = None,
    use_tfxio: Any = _DEPRECATED_SENTINEL,
    force_tf_compat_v1: Optional[bool] = None
)
All the attributes in this context are kept in thread-local state. Note that the temp dir should be accessible to worker jobs; e.g. if running with the Cloud Dataflow runner, the temp dir should be on GCS and should have permissions that allow both the launcher and the workers to access it.
Args:
  temp_dir: (Optional) The temporary directory used within this block.
  desired_batch_size: (Optional) A batch size to batch elements by. If not provided, a batch size will be computed automatically.
  passthrough_keys: (Optional) A set of strings that are keys to instances that should pass through the pipeline and be hidden from the preprocessing_fn. Use this only when additional information must be attached to instances in the pipeline without becoming part of the transformation graph; instance keys are one such example.
  use_deep_copy_optimization: (Optional) If True, makes deep copies of PCollections that are used in multiple TFT phases.
  use_tfxio: Deprecated. Do not set.
  force_tf_compat_v1: (Optional) If True, TFT's public APIs (e.g. AnalyzeDataset) will use TensorFlow in compat.v1 mode irrespective of the installed version of TensorFlow. Defaults to False.
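The passthrough contract can be illustrated without Beam: keys named in passthrough_keys are hidden from the preprocessing_fn and re-attached to its output afterwards. The helper below is a stdlib approximation of that contract (the function name and the `instance_key` field are assumptions for illustration, not tft_beam APIs):

```python
def apply_with_passthrough(instance, preprocessing_fn, passthrough_keys):
    """Hide passthrough keys from preprocessing_fn, then re-attach them."""
    # Split the instance into hidden (passthrough) and visible parts.
    hidden = {k: v for k, v in instance.items() if k in passthrough_keys}
    visible = {k: v for k, v in instance.items() if k not in passthrough_keys}
    # The preprocessing_fn never sees the hidden keys, so they cannot
    # leak into the transformation graph.
    transformed = preprocessing_fn(visible)
    # Passthrough values survive unchanged alongside the transformed features.
    transformed.update(hidden)
    return transformed
```

For example, `apply_with_passthrough({"x": 2, "instance_key": "id-7"}, lambda d: {"x_sq": d["x"] ** 2}, {"instance_key"})` returns `{"x_sq": 4, "instance_key": "id-7"}`: the key rides through while only `x` reaches the transform.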
create_base_temp_dir() -> str
Generate a temporary location.
get_desired_batch_size() -> Optional[int]
Retrieves the user-set fixed batch size, or None if not set.
get_passthrough_keys() -> Iterable[str]
Retrieves the user-set passthrough_keys, or None if not set.
get_use_deep_copy_optimization() -> bool
Retrieves the user-set use_deep_copy_optimization, or None if not set.
get_use_tf_compat_v1() -> bool
Computes use_tf_compat_v1 from TF environment and force_tf_compat_v1.
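One plausible reading of that resolution, stated here as an assumption rather than the library's verbatim logic: under a TF 1.x environment compat.v1 is the only available mode, while under TF 2.x it is used only when force_tf_compat_v1 is explicitly True:

```python
def resolve_use_tf_compat_v1(tf2_enabled, force_tf_compat_v1):
    """Sketch of combining the TF environment with force_tf_compat_v1.

    tf2_enabled is an assumed stand-in for the real TF-version/behavior
    check; force_tf_compat_v1 is the Context argument.
    """
    if not tf2_enabled:
        # TF1 environment: compat.v1 is the only mode available.
        return True
    # TF2 environment: honor an explicit True; None/False mean native TF2.
    return bool(force_tf_compat_v1)
```

This keeps the getter a pure function of the environment and the Context argument, which matches the "computes ... from" phrasing above.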
__exit__(*exn_info)