Sensors (Jul 2022)

Evaluating and Calibrating Uncertainty Prediction in Regression Tasks

  • Dan Levi,
  • Liran Gispan,
  • Niv Giladi,
  • Ethan Fetaya

DOI
https://doi.org/10.3390/s22155540
Journal volume & issue
Vol. 22, no. 15
p. 5540

Abstract


Predicting not only the target but also an accurate measure of uncertainty is important for many machine learning applications, and in particular, safety-critical ones. In this work, we study the calibration of uncertainty prediction for regression tasks, which often arise in real-world systems. We show that the existing definition for the calibration of regression uncertainty has severe limitations in distinguishing informative from non-informative uncertainty predictions. We propose a new definition that escapes this caveat and an evaluation method using a simple histogram-based approach. Our method clusters examples with similar uncertainty predictions and compares the predicted uncertainty with the empirical uncertainty on these examples. We also propose a simple, scaling-based calibration method that performs as well as much more complex ones. We show results on both a synthetic, controlled problem and on the object detection bounding-box regression task using the COCO and KITTI datasets.
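The histogram-based idea described in the abstract can be sketched as follows: bin examples by their predicted standard deviation, then compare the mean predicted sigma in each bin with the empirical (root-mean-square) error observed there. The scaling-based calibration reduces to a single scalar with a closed-form solution under a Gaussian likelihood. This is a minimal illustrative sketch on synthetic data, not the authors' implementation; all variable names and the synthetic setup are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: per-example noise std varies, and the
# model over-estimates its uncertainty by a constant factor of 1.5.
n = 10_000
true_sigma = rng.uniform(0.5, 2.0, n)
err = rng.normal(0.0, true_sigma)      # regression residuals y - mu
pred_sigma = 1.5 * true_sigma          # miscalibrated predicted std

# Histogram-based evaluation: cluster examples with similar predicted
# uncertainty (quantile bins) and compare predicted vs. empirical std.
edges = np.quantile(pred_sigma, np.linspace(0.0, 1.0, 11))
bin_idx = np.clip(np.digitize(pred_sigma, edges[1:-1]), 0, 9)
for b in range(10):
    mask = bin_idx == b
    predicted = pred_sigma[mask].mean()
    empirical = np.sqrt(np.mean(err[mask] ** 2))
    print(f"bin {b}: predicted std {predicted:.2f}, empirical std {empirical:.2f}")

# Scaling-based calibration: rescale all predicted sigmas by a single
# scalar s; minimizing the Gaussian negative log-likelihood over s
# yields the closed form s^2 = mean((err / pred_sigma)^2).
s = np.sqrt(np.mean((err / pred_sigma) ** 2))
calibrated_sigma = s * pred_sigma
print(f"recovered scale s = {s:.3f}")   # close to 1/1.5 here
```

For a well-calibrated model, the predicted and empirical columns should match in every bin; here the scalar correction recovers the inverse of the injected miscalibration factor.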

Keywords