OMASGAN: Out-of-distribution Minimum Anomaly Score GAN for Anomaly Detection
| Published in | 2022 Sensor Signal Processing for Defence Conference (SSPD), pp. 1 - 5 |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.09.2022 |
| DOI | 10.1109/SSPD54131.2022.9896220 |
Summary: Generative models trained in an unsupervised manner may assign high likelihood and low reconstruction loss to Out-of-Distribution (OoD) samples. This leads to failures to detect anomalies, decreasing overall Anomaly Detection (AD) performance. In addition, AD models underperform because anomalies are rare. To address these limitations, we develop the OoD Minimum Anomaly Score GAN (OMASGAN), which performs retraining by including the proposed minimum-anomaly-score OoD samples. These OoD samples are generated on the boundary of the support of the normal-class data distribution in a proposed self-supervised manner. The OMASGAN retraining algorithm leads to a more accurate estimation of the underlying data distribution, including multimodal supports and disconnected modes. For AD inference, we devise a discriminator trained on negative and positive samples that are either generated (negative or positive) or real (positive only). Evaluated on image data using the leave-one-out method, OMASGAN achieves an average AUROC improvement of at least 0.24 and 0.07 points on the MNIST and CIFAR-10 datasets, respectively, over other benchmark AD models.
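The summary only outlines the method at a high level, so the following is a minimal, hypothetical PyTorch sketch of its final inference stage: a discriminator trained on real positive (normal) samples and generated negative (OoD) samples, whose output is used as the anomaly score. The toy 2-D data, the outward-perturbation stand-in for the minimum-anomaly-score boundary samples, and all network sizes and hyperparameters are assumptions for illustration; they are not taken from the paper, which also retrains the generator and uses generated positive samples.

```python
# Minimal sketch of discriminator-based anomaly scoring, under simplifying
# assumptions: the "normal class" is a toy 2-D Gaussian blob, and the
# minimum-anomaly-score boundary OoD samples are approximated by pushing
# real normal points radially outward. Generator retraining is omitted.
import torch
import torch.nn as nn

torch.manual_seed(0)

def mlp(in_dim, out_dim, hidden=64):
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, hidden), nn.ReLU(),
                         nn.Linear(hidden, out_dim))

data_dim = 2
# Discriminator D: outputs the probability that a sample is normal (positive).
D = nn.Sequential(mlp(data_dim, 1), nn.Sigmoid())

# Toy stand-in for the normal-class data distribution.
normal_data = torch.randn(2048, data_dim) * 0.5 + torch.tensor([2.0, -1.0])

def boundary_ood_samples(n):
    """Hypothetical stand-in for OMASGAN's minimum-anomaly-score OoD samples:
    perturb real normal points radially away from the data mean so that they
    land just outside the support of the toy normal distribution."""
    x = normal_data[torch.randint(0, len(normal_data), (n,))]
    offset = x - normal_data.mean(0, keepdim=True)
    direction = offset / offset.norm(dim=1, keepdim=True).clamp_min(1e-6)
    return x + 2.0 * direction

optimizer = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(500):
    real_pos = normal_data[torch.randint(0, len(normal_data), (128,))]  # label 1
    gen_neg = boundary_ood_samples(128)                                 # label 0
    x = torch.cat([real_pos, gen_neg])
    y = torch.cat([torch.ones(128, 1), torch.zeros(128, 1)])
    loss = bce(D(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Inference: anomaly score = 1 - D(x); higher values indicate anomalies.
test_points = torch.tensor([[2.0, -1.0],   # in-distribution point
                            [8.0, 8.0]])   # far-away (anomalous) point
print((1.0 - D(test_points)).squeeze(1).tolist())
```

On this toy data, the in-distribution test point should receive a low anomaly score and the far-away point a high one, illustrating how a discriminator trained against boundary OoD samples can serve as the AD scoring function.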