Sample-Based Continuous Approximate Method for Constructing Interval Neural Network


Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 36, no. 4, pp. 5974–5987
Main Authors: Shen, Xun; Ouyang, Tinghui; Hashimoto, Kazumune; Wu, Yuhu
Format: Journal Article
Language: English
Published: United States, Institute of Electrical and Electronics Engineers (IEEE), 01.04.2025

Summary: In safety-critical engineering applications, such as robust prediction against adversarial noise, it is necessary to quantify neural networks' uncertainty. Interval neural networks (INNs) are effective models for uncertainty quantification, giving an interval of predictions instead of a single value for a corresponding input. This article formulates the problem of training an INN as a chance-constrained optimization problem. The optimal solution of the formulated chance-constrained optimization naturally forms an INN that gives the tightest interval of predictions at a required confidence level. Since the chance-constrained optimization problem is intractable, a sample-based continuous approximate method is used to obtain approximate solutions. We prove the uniform convergence of the approximation, showing that its solutions are consistent with those of the original problem. Additionally, we investigate the reliability of the approximation with finite samples, deriving a probability bound on constraint violation under finite samples. Through a numerical example and an application case study of anomaly detection in wind power data, we evaluate the effectiveness of the proposed INN against existing approaches, including Bayesian neural networks, highlighting its capability to significantly improve the performance of INNs for regression and unsupervised anomaly detection.
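The chance-constrained training idea described in the summary can be illustrated with a toy sample-based sketch. Everything below is an illustrative assumption, not the paper's actual INN or algorithm: the "interval model" is a pair of affine bounds instead of a network, the chance constraint P(y in [lower, upper]) >= 1 - eps is approximated by a smoothed empirical coverage over samples, and a crude random search stands in for gradient-based training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D regression data with input-dependent noise (not the paper's dataset)
n = 500
x = rng.uniform(-1.0, 1.0, n)
y = x + 0.2 * (1.0 + np.abs(x)) * rng.standard_normal(n)

eps = 0.1  # allowed violation probability: interval should cover y with prob >= 1 - eps

def bounds(params, x):
    # Toy "interval model": affine lower/upper bounds, a stand-in for an INN's two outputs
    a_lo, b_lo, a_up, b_up = params
    return a_lo * x + b_lo, a_up * x + b_up

def smooth_indicator(z, beta=50.0):
    # Sigmoid approximation of the indicator 1{z >= 0}; this is what makes the
    # sample-based chance constraint continuous (and hence optimizable)
    return 1.0 / (1.0 + np.exp(-np.clip(beta * z, -60.0, 60.0)))

def objective(params, lam=10.0):
    lo, up = bounds(params, x)
    width = np.mean(up - lo)  # tightness of the interval (the quantity to minimize)
    # Sample-based, smoothed estimate of P(y lies inside [lo, up])
    coverage = np.mean(smooth_indicator(y - lo) * smooth_indicator(up - y))
    # Penalty form of the chance constraint: coverage must reach 1 - eps
    return width + lam * max(0.0, (1.0 - eps) - coverage)

# Crude random search in place of gradient-based training, to keep the sketch short
best, best_val = None, np.inf
center = np.array([1.0, -0.5, 1.0, 0.5])  # rough initial guess around the true trend
for _ in range(8000):
    cand = center + rng.uniform(-1.0, 1.0, 4)
    val = objective(cand)
    if val < best_val:
        best, best_val = cand, val

lo, up = bounds(best, x)
emp_cov = float(np.mean((y >= lo) & (y <= up)))
print(f"empirical coverage: {emp_cov:.3f}, mean interval width: {float(np.mean(up - lo)):.3f}")
```

The trade-off in `objective` mirrors the formulation in the summary: minimizing interval width while the (approximated) chance constraint pushes empirical coverage toward the required confidence level 1 - eps.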
ISSN: 2162-237X
EISSN: 2162-2388
DOI: 10.1109/TNNLS.2024.3409379