Wraps a stateless metric function with the Mean metric.
```python
tf.keras.metrics.MeanMetricWrapper(
    fn, name=None, dtype=None, **kwargs
)
```
You could use this class to quickly build a mean metric from a function. The function needs to have the signature `fn(y_true, y_pred)` and return a per-sample loss array. `MeanMetricWrapper.result()` will return the average metric value across all samples seen so far.
```python
def accuracy(y_true, y_pred):
  return tf.cast(tf.math.equal(y_true, y_pred), tf.float32)

accuracy_metric = tf.keras.metrics.MeanMetricWrapper(fn=accuracy)

keras_model.compile(..., metrics=accuracy_metric)
```
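Under the hood, the wrapper behaves like a `Mean` metric: each `update_state` call evaluates `fn` per sample and folds the results into a running total and count, and `result()` divides one by the other. A minimal pure-Python sketch of that bookkeeping (illustrative only, not the TensorFlow implementation):

```python
class MeanMetricWrapperSketch:
    """Illustrative stand-in for tf.keras.metrics.MeanMetricWrapper."""

    def __init__(self, fn, name=None):
        self.fn = fn
        self.name = name or fn.__name__
        self.total = 0.0   # sum of all per-sample metric values seen so far
        self.count = 0.0   # number of samples seen so far

    def update_state(self, y_true, y_pred):
        # fn yields one value per sample; accumulate their sum and count.
        values = [self.fn(t, p) for t, p in zip(y_true, y_pred)]
        self.total += sum(values)
        self.count += len(values)

    def result(self):
        # Idempotent: reads the state variables, never mutates them.
        return self.total / self.count if self.count else 0.0

    def reset_state(self):
        self.total = 0.0
        self.count = 0.0


def accuracy(y_true, y_pred):
    # Per-sample metric: 1.0 on a match, 0.0 otherwise.
    return 1.0 if y_true == y_pred else 0.0


m = MeanMetricWrapperSketch(fn=accuracy)
m.update_state([1, 2, 3, 4], [1, 2, 0, 4])
print(m.result())  # 0.75 -- three of four predictions match
```

The real wrapper applies `fn` to whole tensors at once and tracks the state in TF variables, but the total-and-count mechanics are the same.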
Args

| `fn` | The metric function to wrap, with signature `fn(y_true, y_pred, **kwargs)`. |
| `name` | (Optional) string name of the metric instance. |
| `dtype` | (Optional) data type of the metric result. |
| `**kwargs` | Keyword arguments to pass on to `fn`. |
```python
merge_state(
    metrics
)
```
Merges the state from one or more metrics.
This method can be used by distributed systems to merge the state computed by different metric instances. Typically the state will be stored in the form of the metric's weights. For example, a tf.keras.metrics.Mean metric contains a list of two weight values: a total and a count. If there were two instances of a tf.keras.metrics.Accuracy that each independently aggregated partial state for an overall accuracy calculation, these two metrics' states could be combined as follows:
```python
m1 = tf.keras.metrics.Accuracy()
_ = m1.update_state([[1], [2]], [[0], [2]])

m2 = tf.keras.metrics.Accuracy()
_ = m2.update_state([[3], [4]], [[3], [4]])

m2.merge_state([m1])
m2.result().numpy()  # 0.75
Args

| `metrics` | an iterable of metrics. The metrics must have compatible state. |

Raises

| `ValueError` | If the provided iterable does not contain metrics matching the metric's required specifications. |
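Because a mean-style metric's state is just its weights (a total and a count), merging amounts to summing each instance's weights element-wise. A pure-Python sketch of that idea (illustrative only, not the TensorFlow implementation):

```python
class MeanState:
    """Toy mean metric whose state is a total and a count,
    mirroring the two weights of tf.keras.metrics.Mean."""

    def __init__(self):
        self.total = 0.0
        self.count = 0.0

    def update_state(self, values):
        self.total += sum(values)
        self.count += len(values)

    def merge_state(self, metrics):
        # Fold each other instance's weights into this one, element-wise.
        for m in metrics:
            self.total += m.total
            self.count += m.count

    def result(self):
        return self.total / self.count if self.count else 0.0


m1 = MeanState()
m1.update_state([1.0, 0.0])   # mean 0.5 over 2 samples

m2 = MeanState()
m2.update_state([1.0, 1.0])   # mean 1.0 over 2 samples

m2.merge_state([m1])
print(m2.result())  # 0.75 -- (1 + 0 + 1 + 1) / 4
```

Note that merging the weights gives the exact pooled mean, which averaging the two `result()` values would only match when both instances saw the same number of samples.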
reset_state()

Resets all of the metric state variables.
This function is called between epochs/steps, when a metric is evaluated during training.
result()

Computes and returns the metric value tensor.
Result computation is an idempotent operation that simply calculates the metric value using the state variables.
```python
update_state(
    y_true, y_pred, sample_weight=None
)
```
Accumulates metric statistics.
For sparse categorical metrics, the shapes of `y_true` and `y_pred` are different.

Args

| `y_true` | Ground truth label values. shape = `[batch_size, d0, .. dN-1]` or shape = `[batch_size, d0, .. dN-1, 1]`. |
| `y_pred` | The predicted probability values. shape = `[batch_size, d0, .. dN]`. |
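When `sample_weight` is supplied, the per-sample values are weighted before averaging: the running total accumulates the weighted sum of values and the count accumulates the sum of the weights. A small sketch of one such update step (illustrative pure Python, not TensorFlow code):

```python
def weighted_mean_update(total, count, values, sample_weight=None):
    """One update_state step for a mean-style metric, with optional weights."""
    if sample_weight is None:
        # Unweighted case: every sample counts once.
        sample_weight = [1.0] * len(values)
    total += sum(w * v for w, v in zip(sample_weight, values))
    count += sum(sample_weight)
    return total, count


# Two samples, the second weighted twice as heavily as the first.
total, count = weighted_mean_update(
    0.0, 0.0, values=[1.0, 0.0], sample_weight=[1.0, 2.0]
)
print(total / count)  # (1*1.0 + 2*0.0) / (1 + 2) = 1/3
```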