Evaluating Uncertainty Calibration for Open-Set Recognition
Format | Journal Article
---|---
Language | English
Published | 14.05.2022
Summary: Despite achieving enormous success in predictive accuracy for visual classification problems, deep neural networks (DNNs) tend to produce overconfident probabilities on out-of-distribution (OOD) data. Yet accurate uncertainty estimation is crucial for safe and reliable robot autonomy. In this paper, we evaluate popular calibration techniques under open-set conditions, in a way that is distinctly different from the conventional evaluation of calibration methods on OOD data. Our results show that closed-set DNN calibration approaches are much less effective for open-set recognition, which highlights the need to develop new DNN calibration methods to address this problem.
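The abstract above does not name the calibration metric used; a common choice for quantifying the confidence/accuracy mismatch it describes is the expected calibration error (ECE), which bins predictions by confidence and averages the gap between mean confidence and accuracy in each bin. The following is an illustrative sketch (not the paper's evaluation code), assuming NumPy:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Binned ECE: weighted average |mean confidence - accuracy| per bin.

    confidences: predicted top-class probabilities in [0, 1]
    correct:     1.0 if the prediction was right, else 0.0
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(confidences[mask].mean() - correct[mask].mean())
            ece += mask.mean() * gap  # weight bin by its share of samples
    return ece

# Toy example: 80% confidence matched by 80% accuracy -> perfectly calibrated.
conf = np.full(10, 0.8)
corr = np.array([1, 1, 1, 1, 1, 1, 1, 1, 0, 0], dtype=float)
print(expected_calibration_error(conf, corr))  # → 0.0
```

An overconfident model in the paper's sense would be one whose mean confidence in each bin exceeds its accuracy there, e.g. 90% confidence with 50% accuracy gives an ECE of 0.4 under this metric.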
DOI | 10.48550/arxiv.2205.07160