Noise-aware clustering based on maximum correntropy criterion and adaptive graph regularization
Published in | Information Sciences, Vol. 626, pp. 42 - 59 |
---|---|
Main Authors | , , |
Format | Journal Article |
Language | English |
Published | Elsevier Inc, 01.05.2023 |
Summary: | Graph-based clustering is a fundamental topic in machine learning, but most existing methods still suffer from the following deficiencies. First, similarity graph construction and the division of data into classes are treated as two independent steps. Second, noise contained in real data may make the learned similarity graph inaccurate. Third, traditional metrics based on Euclidean distance struggle to handle non-Gaussian noise. To eliminate these limitations, a noise-aware clustering method based on the maximum correntropy criterion and adaptive graph regularization (NCCAGR) is proposed. 1) To turn the two-step procedure into a single step, a joint learning framework is formulated that simultaneously learns a robust similarity graph and performs data clustering; 2) to overcome the influence of noise, a Laplacian matrix is constructed and adaptive graph regularization is performed on the clean data; 3) correntropy is introduced to handle the non-Gaussian and heavy-tailed noise in the original data. Furthermore, a half-quadratic optimization method is used to transform the problem into a quadratic form that facilitates its solution. Finally, experiments show that the proposed method not only achieves high clustering performance but also outperforms both classical and state-of-the-art methods in robustness. |
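The summary's key ingredients are the maximum correntropy criterion and its half-quadratic optimization. This is not the authors' NCCAGR algorithm itself, but a minimal sketch of that mechanism on a toy problem (robust location estimation): replacing the squared Euclidean loss with a Gaussian correntropy kernel, then alternating between fixing kernel weights and solving the resulting weighted least-squares problem. The function name `mcc_mean` and the bandwidth `sigma` are illustrative choices, not from the paper.

```python
import numpy as np

def mcc_mean(x, sigma=1.0, n_iter=50, tol=1e-8):
    """Estimate a location parameter under the maximum correntropy
    criterion via half-quadratic optimization: each iteration fixes
    Gaussian-kernel weights w_i = exp(-(x_i - mu)^2 / (2 sigma^2)),
    which downweight large residuals, then solves the resulting
    weighted least-squares problem in closed form."""
    mu = float(np.median(x))  # robust initialization
    for _ in range(n_iter):
        w = np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))
        mu_new = float(np.sum(w * x) / np.sum(w))
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

# Clean samples near 0 plus a few heavy-tailed outliers.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 0.1, 100), [50.0, 60.0, 80.0]])
print(abs(np.mean(x)))   # Euclidean (least-squares) estimate is dragged toward the outliers
print(abs(mcc_mean(x)))  # correntropy estimate stays near 0
```

The Gaussian kernel assigns near-zero weight to outliers, which is exactly why a correntropy-based loss is less sensitive to non-Gaussian, heavy-tailed noise than one based on Euclidean distance.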
ISSN: 0020-0255, 1872-6291
DOI: 10.1016/j.ins.2023.01.024