
tfp.stats.expected_calibration_error


Compute the Expected Calibration Error (ECE).

tfp.stats.expected_calibration_error(
    num_bins,
    logits=None,
    labels_true=None,
    labels_predicted=None,
    name=None
)

This method implements equation (3) in [1]. In that equation, the predicted probability of the decided label is compared with how often that label is actually correct, which measures how well calibrated the predictor is.
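For reference, equation (3) of [1] defines the ECE as below, following the notation of that paper: B_m is the set of instances whose predicted confidence falls into the m-th of M = num_bins bins, acc(B_m) and conf(B_m) are the accuracy and the mean predicted confidence within that bin, and n is the total number of instances.

    ECE = \sum_{m=1}^{M} \frac{|B_m|}{n} \, \left| \mathrm{acc}(B_m) - \mathrm{conf}(B_m) \right|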

References

[1]: Chuan Guo, Geoff Pleiss, Yu Sun, Kilian Q. Weinberger, On Calibration of Modern Neural Networks. Proceedings of the 34th International Conference on Machine Learning (ICML 2017). arXiv:1706.04599 https://arxiv.org/pdf/1706.04599.pdf

Args:

  • num_bins: int, number of probability bins, e.g. 10.
  • logits: Tensor, (n, nlabels), with logits for n instances and nlabels classes.
  • labels_true: Tensor, (n,), with tf.int32 or tf.int64 elements containing ground truth class labels in the range [0, nlabels).
  • labels_predicted: Tensor, (n,), with tf.int32 or tf.int64 elements containing decisions of the predictive system. If None, the argmax of the logits is used as the decision.
  • name: Python str name prefixed to Ops created by this function.

Returns:

  • ece: Tensor, scalar, tf.float32.
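
Example

A minimal usage sketch; the logits and labels below are made-up values chosen only to illustrate the call under the signature documented above.

import tensorflow as tf
import tensorflow_probability as tfp

# Example logits for n=4 instances and nlabels=3 classes (made-up values).
logits = tf.constant([[ 2.0, 0.5, -1.0],
                      [ 0.1, 1.5,  0.3],
                      [-0.5, 0.2,  2.2],
                      [ 1.0, 1.1,  0.9]], dtype=tf.float32)

# Ground truth class labels for the 4 instances.
labels_true = tf.constant([0, 1, 2, 0], dtype=tf.int32)

# With labels_predicted=None, the argmax of the logits is used as the decision.
ece = tfp.stats.expected_calibration_error(
    num_bins=10,
    logits=logits,
    labels_true=labels_true)
# ece is a scalar tf.float32 Tensor; smaller values indicate better calibration.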