tfx.components.infra_validator.serving_bins.TensorFlowServing

TensorFlow Serving binary.

Inherits From: ServingBinary

tfx.components.infra_validator.serving_bins.TensorFlowServing(
    model_name, tag=None, digest=None
)
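
For illustration, a minimal construction sketch; the model name and image tag below are hypothetical values, not defaults of this class:

from tfx.components.infra_validator.serving_bins import TensorFlowServing

# Wrap a TensorFlow Serving image for a model named "my_model".
# "2.11.0" is a hypothetical image tag; per the signature above, a
# digest can be supplied instead to pin the image.
serving_binary = TensorFlowServing(model_name='my_model', tag='2.11.0')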

Attributes:

  • container_port: Container port of the model server.

    Only applies to docker-compatible serving binaries.

  • image: Container image of the model server.

    Only applies to docker-compatible serving binaries.

Methods

MakeClient

MakeClient(
    endpoint
)

Create a model server client for this serving binary.
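
A usage sketch, assuming the server is already running and reachable; the "host:port" endpoint format is an assumption for illustration, not documented here:

from tfx.components.infra_validator.serving_bins import TensorFlowServing

serving_binary = TensorFlowServing(model_name='my_model', tag='2.11.0')
# "localhost:8500" is a hypothetical endpoint for a locally running
# server; adjust to wherever the model server is reachable.
client = serving_binary.MakeClient('localhost:8500')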

MakeDockerRunParams

MakeDockerRunParams(
    host_port, model_base_path=None, host_model_path=None
)

Make the parameter dictionary for the Docker Python SDK's client.containers.run.

Args:

  • host_port: Available port on the host to bind to the container port.
  • model_base_path: (Optional) Model base path for TensorFlow Serving. If the model is exported to a remote destination, specify its location (e.g. gs://your_bucket/model_base_path); gfile will resolve it. If the model is on the local host machine, leave model_base_path at its default value (/model) and use the host_model_path argument to configure a volume mount from the host machine into the container.
  • host_model_path: (Optional) Host path of the exported model. Use this only if the exported SavedModel is on the local host machine. Setting it creates a volume mount from host_model_path to {model_base_path}/{model_name} inside the container.

Returns:

A dictionary of docker run parameters.
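
Since the return value is keyed for docker's client.containers.run, a minimal sketch with the Docker Python SDK might look as follows; the host port and model path are hypothetical, and the returned dict is assumed to carry everything containers.run needs, including the image:

import docker

from tfx.components.infra_validator.serving_bins import TensorFlowServing

serving_binary = TensorFlowServing(model_name='my_model', tag='2.11.0')

docker_client = docker.from_env()
# host_port=8500 and the SavedModel path are hypothetical values. With
# host_model_path set, model_base_path is left at its default (/model)
# and a volume mount into the container is configured.
run_params = serving_binary.MakeDockerRunParams(
    host_port=8500,
    host_model_path='/tmp/exported_model')
container = docker_client.containers.run(**run_params)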

MakeEnvVars

MakeEnvVars(
    model_base_path=None
)

Construct environment variables to be used in the container image.

Only applies to docker-compatible serving binaries.

Args:

  • model_base_path: (Optional) Model base path for TensorFlow Serving, resolved the same way as in MakeDockerRunParams.

Returns:

A dictionary of environment variables inside container.
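
A sketch of the remote-model case, reusing the gs:// location from the docs above; the exact variable names in the returned dictionary are not specified here, so treat them as implementation details:

from tfx.components.infra_validator.serving_bins import TensorFlowServing

serving_binary = TensorFlowServing(model_name='my_model', tag='2.11.0')
# For a remote model base path no host volume mount is needed; the
# path is handed to the container through its environment.
env_vars = serving_binary.MakeEnvVars(
    model_base_path='gs://your_bucket/model_base_path')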