Subspace Modeling for Fast Out-Of-Distribution and Anomaly Detection

Bibliographic Details
Published in 2022 IEEE International Conference on Image Processing (ICIP), pp. 3041 - 3045
Main Authors Ndiour, Ibrahima J., Ahuja, Nilesh A., Tickoo, Omesh
Format Conference Proceeding
Language English
Published IEEE 16.10.2022
Summary: This paper presents a fast, principled approach for detecting anomalous and out-of-distribution (OOD) samples in deep neural networks (DNNs). We propose applying linear statistical dimensionality reduction techniques to the semantic features produced by a DNN in order to capture the low-dimensional subspace truly spanned by those features. We show that the feature reconstruction error (FRE), the ℓ2-norm of the difference between the original feature in the high-dimensional space and the pre-image of its low-dimensional reduced embedding, is highly effective for OOD and anomaly detection. To generalize to intermediate features produced at any given layer, we extend the methodology by applying nonlinear kernel-based methods. Experiments on standard image datasets and DNN architectures demonstrate that our method meets or exceeds best-in-class detection quality at a fraction of the computational and memory cost required by the state of the art. It can be trained and run very efficiently, even on a traditional CPU.
ISSN: 2381-8549
DOI: 10.1109/ICIP46576.2022.9897694
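
The summary above defines the FRE score as the ℓ2-norm between a feature vector and the pre-image of its reduced embedding. The sketch below illustrates that computation under the assumption that PCA serves as the linear dimensionality reduction; the helper names, subspace dimension, and usage lines are illustrative and not taken from the paper.

    # Minimal sketch of the FRE score described in the summary, assuming PCA as
    # the linear dimensionality reduction; names and dimensions are illustrative.
    import numpy as np
    from sklearn.decomposition import PCA

    def fit_subspace(in_dist_features: np.ndarray, n_components: int = 64) -> PCA:
        """Fit a linear subspace model to in-distribution DNN features."""
        return PCA(n_components=n_components).fit(in_dist_features)

    def fre_score(subspace: PCA, features: np.ndarray) -> np.ndarray:
        """FRE: l2-norm of the difference between each feature vector and the
        pre-image of its low-dimensional embedding."""
        embedding = subspace.transform(features)           # project onto the subspace
        pre_image = subspace.inverse_transform(embedding)  # map back to feature space
        return np.linalg.norm(features - pre_image, axis=1)

    # Usage (illustrative): fit on in-distribution features, then score test
    # features; larger FRE values indicate likely OOD or anomalous samples.
    # subspace = fit_subspace(train_features)
    # scores = fre_score(subspace, test_features)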