
tfx.components.Evaluator


A TFX component to evaluate models trained by a TFX Trainer component.

Inherits From: BaseComponent

tfx.components.Evaluator(
    examples=None, model=None, baseline_model=None, feature_slicing_spec=None,
    fairness_indicator_thresholds=None, output=None, model_exports=None,
    instance_name=None, eval_config=None, blessing=None
)


The Evaluator component performs model evaluations in the TFX pipeline; the resulting metrics can be visualized in a Jupyter notebook. It uses the examples generated by the ExampleGen component to evaluate the models.

Specifically, it can provide:

  • metrics computed on the entire training and eval datasets
  • tracking of metrics over time
  • model quality performance on different feature slices

Exporting the EvalSavedModel in Trainer

To set up Evaluator in a TFX pipeline, an EvalSavedModel must be exported during training. This is a special SavedModel containing annotations for the metrics, features, labels, and so on in your model. Evaluator uses this EvalSavedModel to compute metrics.

As part of this, the Trainer component creates eval_input_receiver_fn, analogous to the serving_input_receiver_fn, which will extract the features and labels from the input data. As with serving_input_receiver_fn, there are utility functions to help with this.
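As an illustration, exporting an EvalSavedModel for an estimator-based model might look like the following sketch using TFMA's export utilities. The feature spec, label key, `estimator`, and `eval_model_dir` here are placeholder assumptions for the example, not values prescribed by this API:

```python
import tensorflow as tf
import tensorflow_model_analysis as tfma

# Hypothetical feature spec; replace with your model's actual features.
feature_spec = {
    'age': tf.io.FixedLenFeature([], tf.float32),
    'label': tf.io.FixedLenFeature([], tf.int64),
}

def eval_input_receiver_fn():
  # build_parsing_eval_input_receiver_fn builds a receiver that parses
  # serialized tf.Examples and separates features from the label.
  return tfma.export.build_parsing_eval_input_receiver_fn(
      feature_spec, label_key='label')()

# Export the EvalSavedModel alongside the regular serving SavedModel.
# `estimator` and `eval_model_dir` are assumed to be defined elsewhere.
tfma.export.export_eval_savedmodel(
    estimator=estimator,
    export_dir_base=eval_model_dir,
    eval_input_receiver_fn=eval_input_receiver_fn)
```

Evaluator then points at this export to compute the configured metrics.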

Please see https://www.tensorflow.org/tfx/model_analysis for more details.

Example

  # Uses TFMA to compute evaluation statistics over the features of a model.
  model_analyzer = Evaluator(
      examples=example_gen.outputs['examples'],
      model=trainer.outputs['model'],
      eval_config=tfma.EvalConfig(...))

Args:

  • examples: A Channel of type standard_artifacts.Examples, usually produced by an ExampleGen component (required).
  • model: A Channel of type standard_artifacts.Model, usually produced by a Trainer component.
  • baseline_model: An optional Channel of type standard_artifacts.Model, used as the baseline model for model diffing and model validation purposes.
  • feature_slicing_spec: Deprecated; use eval_config instead. Only supported for estimator-based models. An evaluator_pb2.FeatureSlicingSpec instance that describes how Evaluator should slice the data. If any field is provided as a RuntimeParameter, feature_slicing_spec should be constructed as a dict with the same field names as the FeatureSlicingSpec proto message.
  • fairness_indicator_thresholds: Optional list of float (or RuntimeParameter) threshold values for use with TFMA fairness indicators. Experimental functionality: this interface and functionality may change at any time.
  • output: Channel of ModelEvalPath to store the evaluation results.
  • model_exports: Backwards-compatibility alias for the model argument. Either model or model_exports must be present in the input arguments.
  • instance_name: Optional name assigned to this specific instance of Evaluator. Required only if multiple Evaluator components are declared in the same pipeline.
  • eval_config: Instance of tfma.EvalConfig containing configuration settings for running the evaluation. This config has options for both estimator and Keras.
  • blessing: Output channel of ModelBlessingPath that contains the blessing result.
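Putting these arguments together, a fuller Evaluator configuration might look like the following sketch. The label key, slice feature, accuracy threshold, and the `model_resolver` node are illustrative assumptions, not defaults of this component:

```python
import tensorflow_model_analysis as tfma
from tfx.components import Evaluator

# Hypothetical eval config: one overall slice plus a per-feature slice,
# with a lower-bound threshold on binary accuracy.
eval_config = tfma.EvalConfig(
    model_specs=[tfma.ModelSpec(label_key='label')],
    slicing_specs=[
        tfma.SlicingSpec(),                                 # overall metrics
        tfma.SlicingSpec(feature_keys=['trip_start_hour']), # sliced metrics
    ],
    metrics_specs=[
        tfma.MetricsSpec(metrics=[
            tfma.MetricConfig(
                class_name='BinaryAccuracy',
                threshold=tfma.MetricThreshold(
                    value_threshold=tfma.GenericValueThreshold(
                        lower_bound={'value': 0.6}))),
        ])
    ])

# `example_gen`, `trainer`, and `model_resolver` are assumed to be
# upstream nodes defined elsewhere in the pipeline.
model_analyzer = Evaluator(
    examples=example_gen.outputs['examples'],
    model=trainer.outputs['model'],
    baseline_model=model_resolver.outputs['model'],
    eval_config=eval_config)
```

Supplying a baseline_model enables model diffing and validation against the previously blessed model.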

Attributes:

  • component_id: DEPRECATED FUNCTION

  • component_type: DEPRECATED FUNCTION

  • downstream_nodes

  • exec_properties

  • id: Node id, unique across all TFX nodes in a pipeline.

    If an instance name is set, the node id combines the component type with the instance name; otherwise it is the component type alone.

  • inputs

  • outputs

  • type

  • upstream_nodes

Child Classes

class DRIVER_CLASS

class SPEC_CLASS

Methods

add_downstream_node


add_downstream_node(
    downstream_node
)

add_upstream_node


add_upstream_node(
    upstream_node
)

from_json_dict


@classmethod
from_json_dict(
    cls, dict_data
)

Convert from dictionary data to an object.

get_id


@classmethod
get_id(
    cls, instance_name=None
)

Gets the id of a node.

This can be used at pipeline authoring time. For example:

  from tfx.components import Trainer

  resolver = ResolverNode(
      ...,
      model=Channel(
          type=Model,
          producer_component_id=Trainer.get_id('my_trainer')))

Args:

  • instance_name: (Optional) instance name of a node. If given, the instance name will be taken into consideration when generating the id.

Returns:

An id for the node.

to_json_dict


to_json_dict()

Convert from an object to a JSON serializable dictionary.

Class Variables

  • EXECUTOR_SPEC