tfma.metrics.PrecisionAtRecall

Computes best precision where recall is >= specified value.

Inherits From: Metric

The threshold for the given recall value is computed and used to evaluate the corresponding precision.

If sample_weight is None, weights default to 1. Use sample_weight of 0 to mask values.
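To make the semantics concrete, here is a simplified, illustrative reimplementation (not tfma's actual code, which aggregates weighted confusion matrices across a distributed evaluation): scan candidate thresholds, keep those whose recall meets the target, and return the best precision among them.

```python
def precision_at_recall(labels, scores, target_recall, num_thresholds=1000):
    """Best precision over thresholds where recall >= target_recall.

    Simplified sketch of the metric's semantics; `labels` are 0/1 ints,
    `scores` are predicted probabilities in [0, 1].
    """
    thresholds = [i / (num_thresholds - 1) for i in range(num_thresholds)]
    total_pos = sum(labels)
    best = 0.0
    for t in thresholds:
        # Confusion-matrix counts at this threshold.
        tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= t)
        fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= t)
        recall = tp / total_pos if total_pos else 0.0
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        if recall >= target_recall:
            best = max(best, precision)
    return best
```

For example, with labels [1, 1, 0, 0] and scores [0.9, 0.6, 0.7, 0.2], full recall forces the threshold at or below 0.6, which also admits the 0.7-scored negative, so the best achievable precision is 2/3.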

Args

recall: A scalar or a list of scalar values in the range [0, 1].
thresholds: (Optional) Thresholds to use for calculating the confusion matrices. Use one of either thresholds or num_thresholds.
num_thresholds: (Optional) Defaults to 1000. The number of thresholds to use for matching the given recall.
class_id: (Optional) Used with a multi-class model to specify which class to compute the confusion matrix for. When class_id is used, metrics_specs.binarize settings must not be present. Only one of class_id or top_k should be configured.
name: (Optional) String name of the metric instance.
top_k: (Optional) Used with a multi-class model to specify that the top-k values should be used to compute the confusion matrix. The net effect is that the non-top-k values are set to -inf and the matrix is then constructed from the average TP, FP, TN, FN across the classes. When top_k is used, metrics_specs.binarize settings must not be present. Only one of class_id or top_k should be configured. When top_k is set, the default thresholds are [float('-inf')].
**kwargs: (Optional) Additional args to pass along to init (and eventually on to _metric_computation and _metric_value).
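As a configuration sketch (assuming tensorflow_model_analysis is installed; the metric name 'p_at_r90' is an arbitrary illustration), the metric can be turned into metrics specs for an evaluation via tfma.metrics.specs_from_metrics:

```python
import tensorflow_model_analysis as tfma

# Best precision at >= 90% recall, computed for class 1 of a
# multi-class model (so metrics_specs.binarize must not be set).
metrics_specs = tfma.metrics.specs_from_metrics([
    tfma.metrics.PrecisionAtRecall(recall=0.9, class_id=1, name='p_at_r90'),
])
```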

Attributes

compute_confidence_interval: Whether to compute confidence intervals for this metric.

Note that this may not completely remove the computational overhead involved in computing a given metric. This is only respected by the jackknife confidence interval method.

Methods

computations


Creates computations associated with the metric.

from_config

Creates a metric instance from its config.

get_config


Returns serializable config.