Learning on sample-efficient and label-efficient multi-view cardiac data with graph transformer

Bibliographic Details
Published in: Pattern Recognition Letters, Vol. 180, pp. 127-133
Main Authors: Wang, Lujing; Ma, Yunting; Zhang, Wanqiu; Zhao, Xiaoying; Zhao, Xinxiang
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.04.2024
Summary: Predicting cardiovascular disease is challenging, as assessing samples from a single view of information may be insufficient. In this paper, we therefore focus on predicting cardiovascular disease from multi-view cardiac data. However, multi-view cardiac data are usually difficult to collect and label, so an effective predictive model that is both sample-efficient and label-efficient is urgently needed. To address these issues, we propose a multi-view learning method: (i) it uses graph learning to establish and extract relationships between data points, enabling learning from a small number of labeled samples and a small overall sample size; (ii) it integrates features from multiple views to exploit the complementary information in the data; (iii) for data without a given graph of relationships between samples, it uses the transformer attention mechanism to learn these relationships in a data-driven manner. We validate the effectiveness of our method on real heart disease datasets.
Highlights:
• Our method considers multi-view cardiac data to provide comprehensive and accurate information for diagnosis.
• Our method overcomes the limitations of small and sparsely labeled datasets.
• Our method captures global relationships between subjects and achieves high diagnostic accuracy.
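The summary sketches a three-part pipeline: per-view encoding, multi-view feature fusion, and transformer attention standing in for a learned sample graph. The following minimal PyTorch sketch illustrates one way to realize that pipeline; it is not the authors' released code, and the class name, dimensions, sum-based fusion, and the ten-label split are all illustrative assumptions.

```python
# Hypothetical sketch only: names, dimensions, and design choices below
# are assumptions, not the paper's implementation.
import torch
import torch.nn as nn

class MultiViewGraphTransformer(nn.Module):
    def __init__(self, view_dims, d_model=64, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        # One linear encoder per view maps all views into a shared space.
        self.encoders = nn.ModuleList([nn.Linear(d, d_model) for d in view_dims])
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        # Self-attention over the sample axis: the attention weights act as
        # a learned, data-driven relationship graph between subjects.
        self.transformer = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, views):
        # views: one (n_samples, view_dim) tensor per view.
        # Fuse complementary views by summing their encoded features.
        fused = sum(enc(v) for enc, v in zip(self.encoders, views))
        # Treat the whole cohort as a single sequence of sample tokens.
        h = self.transformer(fused.unsqueeze(0)).squeeze(0)
        return self.head(h)

# Label-efficient training: the loss touches only the few labeled subjects.
model = MultiViewGraphTransformer(view_dims=[32, 16])
views = [torch.randn(100, 32), torch.randn(100, 16)]
labels = torch.randint(0, 2, (100,))
labeled = torch.zeros(100, dtype=torch.bool)
labeled[:10] = True  # e.g. only 10 of 100 subjects carry labels
loss = nn.functional.cross_entropy(model(views)[labeled], labels[labeled])
```

Treating the whole cohort as a single attention sequence is one simple way to realize point (iii): no relationship graph between samples has to be provided, because the attention weights themselves serve as a dense, data-driven graph over subjects.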
ISSN: 0167-8655
eISSN: 1872-7344
DOI: 10.1016/j.patrec.2024.03.001