tf.keras.metrics.MeanAbsolutePercentageError

Computes the mean absolute percentage error between y_true and y_pred.

Inherits From: MeanMetricWrapper, Mean, Metric, Layer, Module
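
This metric wraps the element-wise formula loss = 100 * abs((y_true - y_pred) / y_true), averaged over the last axis of each example and then averaged (optionally weighted) across examples. A minimal NumPy sketch of that computation, assuming y_true is clipped away from zero by a small epsilon on the order of 1e-7 (the helper name mape_sketch and the exact epsilon are illustrative, not part of the API):

import numpy as np

def mape_sketch(y_true, y_pred, eps=1e-7):
    # Element-wise percentage error; |y_true| is clipped to eps so that
    # zeros in y_true do not cause division by zero (eps value assumed).
    y_true = np.asarray(y_true, dtype=np.float64)
    y_pred = np.asarray(y_pred, dtype=np.float64)
    per_example = 100.0 * np.mean(
        np.abs(y_true - y_pred) / np.maximum(np.abs(y_true), eps), axis=-1)
    # The metric keeps a running (optionally weighted) mean of these
    # per-example values across update_state() calls.
    return per_example.mean()

print(mape_sketch([[0, 1], [0, 0]], [[1, 1], [0, 0]]))  # ~250000000.0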

Args
name (Optional) string name of the metric instance.
dtype (Optional) data type of the metric result.

Standalone usage:

m = tf.keras.metrics.MeanAbsolutePercentageError()
m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]])
m.result().numpy()
250000000.0
m.reset_state()
m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]],
               sample_weight=[1, 0])
m.result().numpy()
500000000.0
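
The very large values above are expected: y_true contains zeros, and because the denominator is clipped to a tiny epsilon rather than allowed to be zero, an absolute error of 1 against a true value of 0 becomes an enormous percentage. A rough walk-through of the two calls, again assuming an epsilon of about 1e-7:

eps = 1e-7  # assumed clipping value, on the order of tf.keras.backend.epsilon()
first = 100 * (abs(0 - 1) / max(abs(0), eps) + abs(1 - 1) / max(abs(1), eps)) / 2
second = 100 * (abs(0 - 0) / max(abs(0), eps) + abs(0 - 0) / max(abs(0), eps)) / 2
print((first + second) / 2)                # unweighted mean       -> ~250000000.0
print((1 * first + 0 * second) / (1 + 0))  # sample_weight=[1, 0]  -> ~500000000.0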

Usage with compile() API:

model.compile(
    optimizer='sgd',
    loss='mse',
    metrics=[tf.keras.metrics.MeanAbsolutePercentageError()])
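
For context, a minimal end-to-end sketch of the compile() pattern; the toy model, data, and training settings below are illustrative, not part of this page:

import numpy as np
import tensorflow as tf

# Toy regression model; architecture and data are placeholders.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(
    optimizer='sgd',
    loss='mse',
    metrics=[tf.keras.metrics.MeanAbsolutePercentageError()])

x = np.random.rand(32, 4).astype('float32')
y = np.random.rand(32, 1).astype('float32') + 1.0  # keep targets away from zero
history = model.fit(x, y, epochs=2, verbose=0)
# The metric is reported under its name (by default 'mean_absolute_percentage_error').
print(history.history['mean_absolute_percentage_error'])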

Methods

reset_state

Resets all of the metric state variables.

This function is called between epochs/steps, when a metric is evaluated during training.

result

Computes and returns the metric value tensor.

Result computation is an idempotent operation that simply calculates the metric value using the state variables.
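
Taken together, update_state, result, and reset_state support manual accumulation outside compile()/fit(). A short sketch of that pattern (the batch data below is made up):

import tensorflow as tf

m = tf.keras.metrics.MeanAbsolutePercentageError()

# Accumulate over several batches, then read the running result.
batches = [([[2.0], [4.0]], [[2.5], [3.0]]),
           ([[1.0], [5.0]], [[1.1], [4.0]])]
for y_true, y_pred in batches:
    m.update_state(y_true, y_pred)
print(m.result().numpy())  # mean absolute percentage error over all batches so far

# Clear the accumulated state before the next evaluation pass.
m.reset_state()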

update_state

Accumulates metric statistics.

For sparse categorical metrics, the shapes of y_true and y_pred are different.

Args
y_true Ground truth label values. shape = [batch_size, d0, .. dN-1] or shape = [batch_size, d0, .. dN-1, 1].