Self-perceptive feature fusion network with multi-channel graph convolution for brain disorder diagnosis
Published in: Expert Systems with Applications, Vol. 284, p. 127984
Main Authors:
Format: Journal Article
Language: English
Published: Elsevier Ltd, 25.07.2025
Summary: Current brain disorder diagnostic approaches are constrained by a single template or a single modality, neglecting the potential correlations between multi-scale features and the importance of non-imaging data. This results in inefficient extraction of discriminative features from brain functional connectivity networks (BFCNs) and a failure to accurately establish inter-subject associations when relying solely on non-imaging data. To address these issues, we propose a novel self-perceptive feature fusion network with multi-channel graph convolution (MCGC-SPFFN) for brain disorder diagnosis. Specifically, BFCNs are constructed from multi-template data to extract multi-scale features. An MGMC module is designed to explore inter-subject similarities based on phenotypic data and complementary information across distinct templates; it consists of an adaptive edge learning network (AELN) with a parameter-sharing strategy. The multi-channel graph convolutional network (GCN) then aggregates the node features. Furthermore, a self-perceptive feature fusion (SPFF) module is designed to fuse the features through an accuracy-weighted voting strategy and a multi-head cross-attention mechanism. Channel diversity and scale correlation constraints are imposed to thoroughly investigate the latent relationships among features. Experimental results show that MCGC-SPFFN achieves an accuracy of 81.2% for autism spectrum disorder (ASD) and 60.1% for major depressive disorder (MDD), validating that it can simultaneously extract features from multi-template and multi-modality data, and that it outperforms several advanced methods. The source code for MCGC-SPFFN is available at https://github.com/XL-Jiang/MCGC-SPFFN.
ISSN: 0957-4174
DOI: 10.1016/j.eswa.2025.127984
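The summary describes two core ideas: template-specific GCN channels operating on a subject graph, and fusion of the channel features with multi-head cross-attention. The sketch below is a minimal illustration of those two ideas in PyTorch, not the authors' implementation; the names (SimpleGCNLayer, MultiChannelGCNFusion) and the plain dense-adjacency GCN are assumptions made for brevity, and the AELN, accuracy-weighted voting, and the channel diversity and scale correlation constraints are omitted. The authors' full code is at the GitHub link in the summary.

```python
import torch
import torch.nn as nn


class SimpleGCNLayer(nn.Module):
    """One graph convolution over the subject graph: H' = ReLU(A_norm @ H @ W)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        # a_norm: (N, N) normalized subject-graph adjacency; h: (N, in_dim) node features
        return torch.relu(a_norm @ self.linear(h))


class MultiChannelGCNFusion(nn.Module):
    """One GCN channel per template; channels are fused with multi-head cross-attention."""

    def __init__(self, in_dim: int, hid_dim: int, n_channels: int, n_heads: int = 4):
        super().__init__()
        self.channels = nn.ModuleList(
            [SimpleGCNLayer(in_dim, hid_dim) for _ in range(n_channels)]
        )
        self.attn = nn.MultiheadAttention(hid_dim, n_heads, batch_first=True)

    def forward(self, feats, adjs):
        # feats[c]: (N, in_dim) subject features for template channel c
        # adjs[c]:  (N, N) subject-graph adjacency for template channel c
        outs = [gcn(x, a) for gcn, x, a in zip(self.channels, feats, adjs)]
        tokens = torch.stack(outs, dim=1)             # (N, n_channels, hid_dim)
        fused, _ = self.attn(tokens, tokens, tokens)  # each channel attends to the others
        return fused.mean(dim=1)                      # (N, hid_dim) fused subject embedding


if __name__ == "__main__":
    # Toy run: 8 subjects, 2 template channels, 16-dimensional features.
    n_sub, n_ch, feat_dim = 8, 2, 16
    feats = [torch.randn(n_sub, feat_dim) for _ in range(n_ch)]
    adjs = [torch.softmax(torch.randn(n_sub, n_sub), dim=-1) for _ in range(n_ch)]
    model = MultiChannelGCNFusion(in_dim=feat_dim, hid_dim=32, n_channels=n_ch)
    print(model(feats, adjs).shape)  # torch.Size([8, 32])
```

In the paper's pipeline, the per-channel adjacencies would come from the AELN over phenotypic similarity and imaging features rather than the random matrices used in this toy example.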