Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging
Main Authors: | |
Format: | Journal Article |
Language: | English |
Published: | 26.04.2021 |
Summary: | Accounting for predictive uncertainty in medical imaging with deep
learning is of utmost importance. We estimate both aleatoric and epistemic
uncertainty via variational Bayesian inference with Monte Carlo dropout on
regression tasks and show that predictive uncertainty is systematically
underestimated. We apply $\sigma$ scaling with a single scalar value, a
simple yet effective calibration method for both types of uncertainty. The
performance of our approach is evaluated on a variety of common medical
regression data sets using different state-of-the-art convolutional network
architectures. In our experiments, $\sigma$ scaling reliably recalibrates
predictive uncertainty. It is easy to implement and preserves accuracy.
Well-calibrated regression uncertainty enables robust rejection of unreliable
predictions and detection of out-of-distribution samples. Our source code is
available at
https://github.com/mlaves/well-calibrated-regression-uncertainty |
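The $\sigma$ scaling idea described in the summary can be sketched as follows: a single scalar $s$ rescales the model's predicted variances, and its maximum-likelihood value under a Gaussian assumption has the closed form $s^2 = \frac{1}{N}\sum_i (y_i - \mu_i)^2 / \sigma_i^2$. This is a minimal illustration on synthetic data, not the authors' implementation; the function name and the simulated "model outputs" are assumptions for the example.

```python
import numpy as np

def fit_sigma_scale(y_true, mu, sigma2):
    """Closed-form MLE of the scalar s in N(mu, s^2 * sigma2):
    s^2 = mean((y - mu)^2 / sigma2). (Illustrative helper, not from the paper's code.)"""
    return np.sqrt(np.mean((y_true - mu) ** 2 / sigma2))

# Synthetic "validation set": the true observation noise std is 2.0,
# but the (hypothetical) model systematically underestimates it,
# predicting a unit variance for every sample.
rng = np.random.default_rng(0)
mu = np.zeros(10_000)            # predicted means
y = rng.normal(mu, 2.0)          # observations with std 2.0
sigma2 = np.ones_like(mu)        # underestimated predicted variances

s = fit_sigma_scale(y, mu, sigma2)
# s should be close to 2.0, i.e. it recovers the missing noise scale;
# recalibrated variances are then s**2 * sigma2.
print(s)
```

Applying `s**2 * sigma2` then yields recalibrated variances on held-out data, which is what makes thresholding on uncertainty (for rejecting predictions or flagging out-of-distribution inputs) meaningful.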
DOI: | 10.48550/arxiv.2104.12376 |