Evaluating Uncertainty Calibration for Open-Set Recognition

Bibliographic Details
Main Authors: Lyu, Zongyao; Gutierrez, Nolan B.; Beksi, William J.
Format: Journal Article
Language: English
Published: 14.05.2022
Summary: Despite achieving enormous success in predictive accuracy for visual classification problems, deep neural networks (DNNs) tend to produce overconfident probabilities on out-of-distribution (OOD) data. Yet, accurate uncertainty estimation is crucial for safe and reliable robot autonomy. In this paper, we evaluate popular calibration techniques under open-set conditions in a way that is distinctly different from the conventional evaluation of calibration methods on OOD data. Our results show that closed-set DNN calibration approaches are much less effective for open-set recognition, highlighting the need for new DNN calibration methods to address this problem.
DOI: 10.48550/arxiv.2205.07160
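
Note: The summary refers to miscalibrated (overconfident) probabilities without defining how calibration is measured. As background only, below is a minimal NumPy sketch of expected calibration error (ECE), a standard miscalibration metric in this literature. The function, variable names, and synthetic data are illustrative assumptions, not code or results from the paper.

    # Minimal ECE sketch (illustrative; not the authors' code).
    # ECE bins predictions by confidence and averages the gap
    # |accuracy - confidence| per bin, weighted by bin size.
    import numpy as np

    def expected_calibration_error(confidences, predictions, labels, n_bins=15):
        bins = np.linspace(0.0, 1.0, n_bins + 1)
        ece = 0.0
        for lo, hi in zip(bins[:-1], bins[1:]):
            mask = (confidences > lo) & (confidences <= hi)
            if mask.any():
                acc = (predictions[mask] == labels[mask]).mean()  # bin accuracy
                conf = confidences[mask].mean()                   # bin confidence
                ece += mask.mean() * abs(acc - conf)              # weighted gap
        return ece

    # Synthetic example: peaked softmax-like outputs whose labels are
    # unrelated to the scores, mimicking overconfidence on unknown inputs.
    rng = np.random.default_rng(0)
    probs = rng.dirichlet(np.ones(10) * 0.1, size=1000)
    labels = rng.integers(0, 10, size=1000)
    print(expected_calibration_error(probs.max(1), probs.argmax(1), labels))

A well-calibrated model would yield an ECE near zero; the overconfidence on OOD or unknown-class inputs described in the summary shows up as a large gap between mean confidence and accuracy.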