Local generic trainer executor for the TFX Trainer component.
tfx.components.trainer.executor.GenericExecutor(
    context: Optional[tfx.dsl.components.base.base_executor.BaseExecutor.Context] = None
)
The Trainer executor supplements TensorFlow training with a component to enable warm-start training of any user-specified TF model. The Trainer is a library built on top of TensorFlow that is expected to be integrated into a custom user-specified binary.
To include Trainer in a TFX pipeline, configure your pipeline similar to https://github.com/tensorflow/tfx/blob/master/tfx/examples/chicago_taxi_pipeline/taxi_pipeline_simple.py#L104
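As a sketch of that configuration (following the linked taxi example; `transform` and `schema_gen` are assumed upstream components and `module_file` is assumed to point at a file defining `run_fn`), a Trainer wired to this executor might look like:

```python
from tfx.components import Trainer
from tfx.components.trainer.executor import GenericExecutor
from tfx.dsl.components.base import executor_spec
from tfx.proto import trainer_pb2

# Sketch only: 'transform', 'schema_gen', and 'module_file' are assumptions
# standing in for the corresponding objects in a real pipeline definition.
trainer = Trainer(
    module_file=module_file,
    custom_executor_spec=executor_spec.ExecutorClassSpec(GenericExecutor),
    examples=transform.outputs['transformed_examples'],
    transform_graph=transform.outputs['transform_graph'],
    schema=schema_gen.outputs['schema'],
    train_args=trainer_pb2.TrainArgs(num_steps=10000),
    eval_args=trainer_pb2.EvalArgs(num_steps=5000))
```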
How to create a trainer callback function to be used by this Trainer executor: TFX executes model training by invoking a run_fn callback method that defines and trains a TF model, then saves it to the provided location. This callback is the basis of the GenericExecutor, which invokes run_fn with the correct parameters by resolving the input artifacts, output artifacts, and execution properties.
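The contract above can be sketched in plain Python. The real argument TFX passes to run_fn is a FnArgs object with many more fields; the stand-in dataclass and the trivial "model" below are illustrative assumptions, not the actual TFX API:

```python
import json
import os
from dataclasses import dataclass
from typing import List


@dataclass
class FnArgsStub:
    """Stand-in for the argument object TFX passes to run_fn (assumption)."""
    train_files: List[str]
    eval_files: List[str]
    serving_model_dir: str


def run_fn(fn_args: FnArgsStub) -> None:
    """Defines and 'trains' a model, then saves it to the provided location.

    In a real pipeline this would build and fit a TF model and export a
    SavedModel to fn_args.serving_model_dir; here a JSON file stands in
    for the exported model.
    """
    model = {'trained_on': fn_args.train_files}  # placeholder for training
    os.makedirs(fn_args.serving_model_dir, exist_ok=True)
    with open(os.path.join(fn_args.serving_model_dir, 'model.json'), 'w') as f:
        json.dump(model, f)
```

The key point of the contract: run_fn receives everything it needs (data locations, output directory) through its single argument and must write the trained model to the location the executor provides.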
Do(
    input_dict: Dict[Text, List[tfx.types.Artifact]],
    output_dict: Dict[Text, List[tfx.types.Artifact]],
    exec_properties: Dict[Text, Any]
) -> None
Uses a user-supplied run_fn to train a TensorFlow model locally.
The Trainer Executor invokes a run_fn callback function provided by the user via the module_file parameter. In this function, the user defines the model and trains it, then saves the model and training-related files (e.g., TensorBoard logs) to the provided locations.
Args
input_dict: Input dict from input key to a list of ML-Metadata Artifacts.
output_dict: Output dict from output key to a list of Artifacts.
exec_properties: A dict of execution properties.
Raises
ValueError: When neither or both of 'module_file' and 'run_fn' are present in 'exec_properties'.
RuntimeError: If run_fn failed to generate model in desired location.
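A hedged sketch of the first check (not the executor's actual code; it only illustrates the documented "neither or both" condition on exec_properties):

```python
from typing import Any, Dict


def resolve_run_fn_source(exec_properties: Dict[str, Any]) -> str:
    """Returns which property supplies run_fn, mirroring the documented check.

    Raises ValueError when neither or both of 'module_file' and 'run_fn'
    are present in exec_properties, as described above. The function name
    is hypothetical, chosen for this sketch.
    """
    has_module_file = bool(exec_properties.get('module_file'))
    has_run_fn = bool(exec_properties.get('run_fn'))
    if has_module_file == has_run_fn:  # true when neither or both are set
        raise ValueError(
            "Exactly one of 'module_file' or 'run_fn' must be supplied.")
    return 'module_file' if has_module_file else 'run_fn'
```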