Conference Paper · Year: 2024

Beyond the Norms: Detecting Prediction Errors in Regression Models


This paper tackles the challenge of detecting unreliable behavior in regression algorithms, which may arise from intrinsic variability (e.g., aleatoric uncertainty) or modeling errors (e.g., model uncertainty). First, we formally introduce the notion of unreliability in regression, i.e., when the output of the regressor exceeds a specified discrepancy (or error). Then, using powerful tools for probabilistic modeling, we estimate the discrepancy density and measure its statistical diversity with our proposed metric for statistical dissimilarity. In turn, this allows us to derive a data-driven score that expresses the uncertainty of the regression outcome. We show empirical improvements in error detection across multiple regression tasks, consistently outperforming popular baseline approaches and thereby contributing to the broader field of uncertainty quantification and safe machine learning systems.
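The general pipeline the abstract describes (estimate a density over the regressor's discrepancies, then score how surprising each observed error is under that density) can be illustrated with a toy stand-in. The linear model, the Gaussian KDE, and the 95%-quantile threshold below are illustrative assumptions for the sketch, not the authors' actual method or metric:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: a deliberately misspecified linear fit to sin(x).
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)

# Least-squares linear fit.
A = np.hstack([X, np.ones((500, 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w
disc = np.abs(y - pred)  # per-sample discrepancy |y - f(x)|

# Estimate the discrepancy density with a simple Gaussian KDE
# (an illustrative stand-in for the paper's probabilistic model).
h = 1.06 * disc.std() * len(disc) ** (-1 / 5)  # Silverman's rule bandwidth

def kde_logpdf(e, sample=disc, bw=h):
    """Log-density of errors e under a Gaussian KDE fitted to `sample`."""
    z = (e[:, None] - sample[None, :]) / bw
    return np.log(np.exp(-0.5 * z**2).mean(axis=1) / (bw * np.sqrt(2 * np.pi)))

# Data-driven uncertainty score: an error with low density under the fitted
# model is surprising, i.e., the prediction is potentially unreliable.
score = -kde_logpdf(disc)
flagged = disc[score > np.quantile(score, 0.95)]
print(f"flagged {len(flagged)} of {len(disc)} points; "
      f"min flagged error = {flagged.min():.3f}")
```

With this setup the flagged points are exactly the large, low-density errors where the linear model fails to track the sinusoid, which mirrors the intended use of the score as an error detector.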
Origin: Files produced by the author(s)

Dates and versions

HAL Id: hal-04575936, version 1 (31-05-2024)

Andres Altieri, Marco Romanelli, Georg Pichler, Florence Alberge, Pablo Piantanida. Beyond the Norms: Detecting Prediction Errors in Regression Models. Forty-first International Conference on Machine Learning (ICML 2024), Jul 2024, Vienna, Austria. ⟨hal-04575936⟩

