Adjusting Regression Models for Conditional Uncertainty Calibration


Bibliographic Details

Main Authors: Gao, Ruijiang; Yin, Mingzhang; McInerney, James; Kallus, Nathan
Format: Journal Article
Language: English
Published: 25.09.2024

Summary: Conformal Prediction methods have finite-sample distribution-free marginal coverage guarantees. However, they generally do not offer conditional coverage guarantees, which can be important for high-stakes decisions. In this paper, we propose a novel algorithm to train a regression function to improve the conditional coverage after applying the split conformal prediction procedure. We establish an upper bound for the miscoverage gap between the conditional coverage and the nominal coverage rate and propose an end-to-end algorithm to control this upper bound. We demonstrate the efficacy of our method empirically on synthetic and real-world datasets.
DOI: 10.48550/arxiv.2409.17466
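
The summary refers to the split conformal prediction procedure applied on top of a fitted regression function. Below is a minimal sketch of that baseline procedure only (not the paper's adjustment algorithm), assuming an absolute-residual nonconformity score and a scikit-learn style fit/predict regressor; all names and parameters here are illustrative. The constant-width intervals it produces on heteroscedastic data illustrate why marginal coverage can hold while conditional coverage does not.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

def split_conformal_intervals(X, y, X_test, alpha=0.1, model=None):
    """Prediction intervals with finite-sample marginal coverage >= 1 - alpha."""
    model = model if model is not None else LinearRegression()
    # Split the data: one part fits the regression, the other calibrates.
    X_fit, X_cal, y_fit, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)
    model.fit(X_fit, y_fit)
    # Nonconformity scores: absolute residuals on the calibration set.
    scores = np.abs(y_cal - model.predict(X_cal))
    n = len(scores)
    # Finite-sample-corrected quantile of the calibration scores.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level, method="higher")  # "method" needs NumPy >= 1.22
    preds = model.predict(X_test)
    return preds - q, preds + q

# Toy usage on synthetic heteroscedastic data: intervals have a single width q,
# so they over-cover where noise is small and under-cover where it is large,
# even though the marginal coverage is close to 1 - alpha.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(1000, 1))
y = X[:, 0] + (0.2 + np.abs(X[:, 0])) * rng.normal(size=1000)
lo, hi = split_conformal_intervals(X[:800], y[:800], X[800:], alpha=0.1)
print("empirical marginal coverage:", np.mean((y[800:] >= lo) & (y[800:] <= hi)))
```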