Robust Support Vector Data Description with Truncated Loss Function for Outliers Depression

Bibliographic Details
Published in: Entropy (Basel, Switzerland), Vol. 26, No. 8, p. 628
Main Authors: Chen, Huakun; Lyu, Yongxi; Shi, Jingping; Zhang, Weiguo
Format: Journal Article
Language: English
Published: MDPI AG, Switzerland, 25.07.2024
Summary: Support vector data description (SVDD) is widely regarded as an effective technique for addressing anomaly detection problems. However, its performance can significantly deteriorate when the training data are affected by outliers or mislabeled observations. This study introduces a universal truncated loss function framework into the SVDD model to enhance its robustness and employs the fast alternating direction method of multipliers (ADMM) algorithm to solve various truncated loss functions. Moreover, the convergence of the fast ADMM algorithm is analyzed theoretically. Within this framework, we developed the truncated generalized ramp, truncated binary cross-entropy, and truncated linear exponential loss functions for SVDD. We conducted extensive experiments on synthetic and real-world datasets to validate the effectiveness of these three SVDD models in handling data with different noise levels, demonstrating their superior robustness and generalization capabilities compared to other SVDD models.
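
The core idea behind truncating a loss can be illustrated with a minimal Python sketch. It is not the paper's formulation; the function and parameter names (slope, cap) are illustrative assumptions. The point is that the per-sample loss grows with a point's slack (how far it lies outside the SVDD hypersphere) but is capped, so a single extreme outlier contributes at most a bounded amount to the training objective.

    import numpy as np

    def truncated_ramp_loss(slack, slope=1.0, cap=2.0):
        # Ramp-style loss: increases linearly with the slack but is
        # truncated at `cap`, bounding the influence of outliers.
        # `slope` and `cap` are illustrative names, not the paper's notation.
        return np.minimum(np.maximum(0.0, slope * slack), cap)

    # Slack values ||x_i - a||^2 - R^2 for a few points; the last two are outliers.
    slacks = np.array([-0.5, 0.2, 1.0, 10.0, 100.0])
    print(truncated_ramp_loss(slacks))  # -> [0.  0.2 1.  2.  2. ]

With an untruncated (hinge-like) loss, the outlier with slack 100 would dominate the objective; after truncation its contribution equals the cap, which is the robustness mechanism the abstract describes.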
ISSN: 1099-4300
DOI: 10.3390/e26080628