Low-Rank Tensor Based Proximity Learning for Multi-View Clustering

Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, Vol. 35, No. 5, pp. 5076-5090
Main Authors: Chen, Man-Sheng; Wang, Chang-Dong; Lai, Jian-Huang
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2023

Summary: Graph-oriented multi-view clustering methods have achieved impressive performance by exploiting the relationships and complex structures hidden in multi-view data. However, most of them still suffer from two common problems. (1) They focus on learning a common representation or pairwise correlations between views, neglecting the comprehensiveness of, and the deeper higher-order correlations among, multiple views. (2) The prior knowledge of the view-specific representations cannot be taken into account to obtain the consensus indicator graph within a unified graph construction and clustering framework. To address these problems, we propose a novel Low-rank Tensor Based Proximity Learning (LTBPL) approach for multi-view clustering, in which multiple low-rank probability affinity matrices and the consensus indicator graph reflecting the final performance are jointly learned in a unified framework. Specifically, the multiple affinity representations are stacked in a low-rank constrained tensor to recover their comprehensiveness and higher-order correlations. Meanwhile, the view-specific representations, carrying different adaptive confidences, are jointly linked with the consensus indicator graph. Extensive experiments on nine real-world datasets demonstrate the superiority of LTBPL over state-of-the-art methods.
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2022.3151861
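
Note: The summary above describes stacking the view-specific affinity matrices into a low-rank constrained tensor, but this record does not contain the paper's optimization details. The following is only a minimal illustrative sketch, assuming the common choice in this family of methods of a t-SVD based tensor nuclear norm, whose proximal step is tensor singular value thresholding. The function name tensor_svt, the mode ordering (n, n, V), the shrinkage weight tau, and the toy data are assumptions for illustration, not the authors' code.

import numpy as np

def tensor_svt(T, tau):
    # Tensor singular value thresholding: proximal operator of the
    # t-SVD based tensor nuclear norm, applied here to a tensor of
    # stacked affinity matrices with shape (n, n, V).
    n1, n2, n3 = T.shape
    Tf = np.fft.fft(T, axis=2)                    # FFT along the view mode
    Xf = np.zeros_like(Tf)
    for k in range(n3):                           # shrink each frontal slice
        U, s, Vh = np.linalg.svd(Tf[:, :, k], full_matrices=False)
        Xf[:, :, k] = (U * np.maximum(s - tau, 0.0)) @ Vh
    return np.real(np.fft.ifft(Xf, axis=2))       # back to the real domain

# Toy usage: stack V symmetric affinity matrices (n x n) into a tensor
# and apply one low-rank shrinkage step.
rng = np.random.default_rng(0)
n, V = 50, 3
S_views = []
for _ in range(V):
    A = rng.random((n, n))
    S_views.append((A + A.T) / 2)                 # symmetric toy affinity matrix
G = np.stack(S_views, axis=2)                     # shape (n, n, V)
G_lowrank = tensor_svt(G, tau=0.5)

In approaches of this kind, such a shrinkage step is typically alternated with updates of the view-specific affinity matrices and of the consensus indicator graph; the exact alternating scheme used by LTBPL is given in the paper itself.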