Low-Rank Tensor Based Proximity Learning for Multi-View Clustering
Published in | IEEE Transactions on Knowledge and Data Engineering, Vol. 35, No. 5, pp. 5076-5090
Main Authors | , ,
Format | Journal Article
Language | English
Published | New York: IEEE, 01.05.2023 (The Institute of Electrical and Electronics Engineers, Inc.)
Summary | Graph-oriented multi-view clustering methods have achieved impressive performance by exploiting the relationships and complex structures hidden in multi-view data. However, most of them still suffer from two common problems. (1) They aim to learn a common representation or pairwise correlations between views, neglecting the comprehensive, higher-order correlations among multiple views. (2) Prior knowledge of the view-specific representations cannot be taken into account when obtaining the consensus indicator graph within a unified graph-construction-and-clustering framework. To address these problems, we propose a novel Low-rank Tensor Based Proximity Learning (LTBPL) approach for multi-view clustering, in which multiple low-rank probability affinity matrices and a consensus indicator graph reflecting the final performance are jointly learned in a unified framework. Specifically, the multiple affinity representations are stacked in a low-rank constrained tensor to recover their comprehensiveness and higher-order correlations. Meanwhile, each view-specific representation, carrying its own adaptive confidence, is jointly linked with the consensus indicator graph. Extensive experiments on nine real-world datasets demonstrate the superiority of LTBPL over state-of-the-art methods.
ISSN | 1041-4347, 1558-2191
DOI | 10.1109/TKDE.2022.3151861
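The core tensor step described in the summary, stacking the view-specific affinity matrices into a third-order tensor and recovering a low-rank version of it, can be sketched roughly as below. This is a minimal illustration under stated assumptions: the Gaussian-kernel affinities, the plain tensor nuclear norm minimized via t-SVD singular value thresholding, and the uniform averaging into a consensus graph are all simplifications; the paper's actual formulation uses weighted low-rank constraints and adaptive per-view confidences when building the consensus indicator graph.

```python
import numpy as np

def tensor_svt(T, tau):
    """Tensor singular value thresholding via the t-SVD:
    FFT along the third (view) mode, then matrix SVT on each
    frontal slice in the Fourier domain."""
    Tf = np.fft.fft(T, axis=2)
    out = np.zeros_like(Tf)  # complex, same shape as Tf
    for k in range(T.shape[2]):
        U, s, Vh = np.linalg.svd(Tf[:, :, k], full_matrices=False)
        s = np.maximum(s - tau, 0.0)  # soft-threshold the singular values
        out[:, :, k] = (U * s) @ Vh
    return np.real(np.fft.ifft(out, axis=2))

# Toy data: n samples observed in v views; per-view Gaussian-kernel
# affinity matrices stacked into an (n x n x v) tensor.
rng = np.random.default_rng(0)
n, v = 20, 3
T = np.zeros((n, n, v))
for k in range(v):
    X = rng.standard_normal((n, 5))
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    T[:, :, k] = np.exp(-d2 / d2.mean())

L = tensor_svt(T, tau=1.0)   # low-rank recovery of the stacked affinities
S = L.mean(axis=2)           # naive consensus graph (uniform view weights)
S = (S + S.T) / 2            # symmetrize before any spectral clustering step
```

In a full pipeline, `S` would feed a spectral clustering step, and the view weights would be learned jointly with the low-rank tensor rather than fixed to a uniform average as here.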