tfg.nn.metric.fscore.evaluate

Computes the fscore metric for the given ground truth and predicted labels.

The fscore is calculated as 2 * (precision * recall) / (precision + recall), where precision and recall are evaluated by the given function parameters. The precision and recall functions default to their definitions for boolean labels (see https://en.wikipedia.org/wiki/Precision_and_recall for more details).
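For intuition, the sketch below evaluates this formula for a single pair of binary vectors using plain TensorFlow ops; the tensors and values are illustrative only and not part of the API.

```python
import tensorflow as tf

ground_truth = tf.constant([1., 1., 0., 1.])
prediction = tf.constant([1., 0., 0., 1.])

# With binary labels, precision = TP / predicted positives and
# recall = TP / actual positives.
true_positives = tf.reduce_sum(ground_truth * prediction)  # 2.0
precision = true_positives / tf.reduce_sum(prediction)     # 2 / 2 = 1.0
recall = true_positives / tf.reduce_sum(ground_truth)      # 2 / 3 ~= 0.667
fscore = 2. * precision * recall / (precision + recall)    # 0.8
```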

In the following, A1 to An are optional batch dimensions, which must be broadcast compatible.

Args:
ground_truth: A tensor of shape [A1, ..., An, N], where the last axis represents the ground truth values.
prediction: A tensor of shape [A1, ..., An, N], where the last axis represents the predicted values.
precision_function: The function to use for evaluating the precision. Defaults to the precision evaluation for binary ground truth and predictions.
recall_function: The function to use for evaluating the recall. Defaults to the recall evaluation for binary ground truth and predictions.
name: A name for this op. Defaults to "fscore_evaluate".

Returns:
A tensor of shape [A1, ..., An] that stores the fscore metric for the given ground truth labels and predictions.

Raises:
ValueError: if the shape of ground_truth or prediction is not supported.
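A minimal usage sketch, assuming the tensorflow_graphics package layout in which the fscore, precision, and recall modules live under tensorflow_graphics.nn.metric; the input values are illustrative.

```python
import tensorflow as tf
from tensorflow_graphics.nn.metric import fscore, precision, recall

# A batch of two samples with three binary values each.
ground_truth = tf.constant([[1., 1., 0.],
                            [0., 1., 1.]])
prediction = tf.constant([[1., 0., 0.],
                          [0., 1., 0.]])

# Default binary precision and recall: returns a tensor of shape [2],
# one fscore per sample.
scores = fscore.evaluate(ground_truth, prediction)

# precision_function and recall_function accept callables taking the same
# (ground_truth, prediction) pair; here the defaults are passed explicitly
# for illustration.
scores_explicit = fscore.evaluate(
    ground_truth, prediction,
    precision_function=precision.evaluate,
    recall_function=recall.evaluate)
```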