Relative coefficient of discrimination metric.
tfma.metrics.RelativeCoefficientOfDiscrimination(name=RELATIVE_COEFFICIENT_OF_DISCRIMINATION_NAME)
The relative coefficient of discrimination is the ratio of the average prediction over the positive examples to the average prediction over the negative examples. The intuition is simple: it measures how much higher the prediction is expected to be for a positive example than for a negative one.
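To make the definition concrete, here is a minimal sketch of the ratio computed directly with NumPy on hypothetical example data (this illustrates the formula only; it is not the TFMA implementation, which runs as a Beam computation):

```python
import numpy as np

# Hypothetical predictions and binary labels for illustration.
labels = np.array([1, 1, 0, 0, 0])
preds = np.array([0.9, 0.7, 0.4, 0.2, 0.3])

# Average prediction over the positive examples: (0.9 + 0.7) / 2 = 0.8
pos_mean = preds[labels == 1].mean()
# Average prediction over the negative examples: (0.4 + 0.2 + 0.3) / 3 = 0.3
neg_mean = preds[labels == 0].mean()

# Relative coefficient of discrimination: ratio of the two means.
rcd = pos_mean / neg_mean
```

Here the positive examples score 0.8 on average versus 0.3 for the negatives, so the metric is about 2.67: positives are predicted roughly 2.7 times higher than negatives.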
Attributes
compute_confidence_interval: Whether to compute confidence intervals for this metric. Note that this may not completely remove the computational overhead involved in computing a given metric; it is only respected by the jackknife confidence interval method.
computations(
    eval_config: Optional[tfma.EvalConfig] = None,
    schema: Optional[schema_pb2.Schema] = None,
    model_names: Optional[List[Text]] = None,
    output_names: Optional[List[Text]] = None,
    sub_keys: Optional[List[Optional[SubKey]]] = None,
    aggregation_type: Optional[AggregationType] = None,
    class_weights: Optional[Dict[int, float]] = None,
    query_key: Optional[Text] = None,
    is_diff: Optional[bool] = False
) ->
Creates computations associated with metric.
get_config() -> Dict[Text, Any]
Returns serializable config.
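The config returned by get_config is the set of constructor kwargs needed to re-create the metric, so a metric can be serialized and re-instantiated from it. A toy sketch of that pattern (a stand-in class, not TFMA internals):

```python
from typing import Any, Dict


class SketchMetric:
    """Toy stand-in illustrating the get_config serialization pattern."""

    def __init__(self, name: str = 'relative_coefficient_of_discrimination'):
        self.name = name

    def get_config(self) -> Dict[str, Any]:
        # Serializable config: just the constructor kwargs.
        return {'name': self.name}


metric = SketchMetric()
# Re-instantiate an equivalent metric from its serialized config.
clone = SketchMetric(**metric.get_config())
```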