Information-theoretic Generalization Analysis for Expected Calibration Error
Main Authors | , |
Format | Journal Article |
Language | English |
Published | 24.05.2024 |
Summary: While the expected calibration error (ECE), which employs binning, is widely adopted to evaluate the calibration performance of machine learning models, theoretical understanding of its estimation bias is limited. In this paper, we present the first comprehensive analysis of the estimation bias in the two common binning strategies, uniform mass and uniform width binning. Our analysis establishes upper bounds on the bias, achieving an improved convergence rate. Moreover, our bounds reveal, for the first time, the optimal number of bins to minimize the estimation bias. We further extend our bias analysis to generalization error analysis based on the information-theoretic approach, deriving upper bounds that enable the numerical evaluation of how small the ECE is for unknown data. Experiments using deep learning models show that our bounds are nonvacuous thanks to this information-theoretic generalization analysis approach.
DOI | 10.48550/arxiv.2405.15709 |
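As context for the abstract above, the following is a minimal NumPy sketch of the binned ECE estimator with the two binning strategies the paper analyzes, uniform width and uniform mass. This is an illustration only: the function name `ece` and its interface are assumptions, not code from the paper.

```python
import numpy as np

def ece(confidences, correct, n_bins=10, strategy="width"):
    """Binned estimate of the expected calibration error.

    strategy="width": bins of equal width over [0, 1].
    strategy="mass":  bins holding (roughly) equal numbers of samples,
                      with edges at empirical quantiles of the confidences.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    n = len(confidences)
    if strategy == "width":
        edges = np.linspace(0.0, 1.0, n_bins + 1)
    else:
        edges = np.quantile(confidences, np.linspace(0.0, 1.0, n_bins + 1))
    total = 0.0
    for i, (lo, hi) in enumerate(zip(edges[:-1], edges[1:])):
        # Half-open bins [lo, hi); the last bin is closed to include hi.
        if i == n_bins - 1:
            mask = (confidences >= lo) & (confidences <= hi)
        else:
            mask = (confidences >= lo) & (confidences < hi)
        if mask.sum() == 0:
            continue  # empty bins (possible with duplicate quantile edges) are skipped
        avg_conf = confidences[mask].mean()
        avg_acc = correct[mask].mean()
        # Weighted |confidence - accuracy| gap, weight = bin's share of samples.
        total += mask.sum() / n * abs(avg_conf - avg_acc)
    return total
```

For example, ten predictions all made with confidence 0.9 that are all correct give an estimated ECE of 0.1, since the single occupied bin has average confidence 0.9 but accuracy 1.0. The estimator's bias, and how it depends on the number of bins `n_bins`, is exactly the quantity the paper bounds.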