MS²-GNN: Exploring GNN-Based Multimodal Fusion Network for Depression Detection
Published in: IEEE Transactions on Cybernetics, Vol. 53, No. 12, pp. 7749–7759
Main Authors:
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.12.2023
Summary: Major depressive disorder (MDD) is one of the most common and severe mental illnesses, placing a heavy burden on society and families. Recently, several multimodal methods have been proposed to learn a multimodal embedding for MDD detection and have achieved promising performance. However, these methods ignore the heterogeneity/homogeneity among the various modalities, and earlier attempts also neglect interclass separability and intraclass compactness. Motivated by these observations, we propose a graph neural network (GNN)-based multimodal fusion strategy, the modal-shared modal-specific GNN (MS²-GNN), which investigates the heterogeneity/homogeneity among various psychophysiological modalities and explores the potential relationships between subjects. Specifically, we develop a modal-shared and modal-specific GNN architecture to extract inter- and intramodal characteristics. Furthermore, a reconstruction network is employed to preserve fidelity within each individual modality. Moreover, we impose an attention mechanism on the various embeddings to obtain a compact multimodal representation for the subsequent MDD detection task. We conduct extensive experiments on two public depression datasets, and the favorable results demonstrate the effectiveness of the proposed algorithm.
ISSN: 2168-2267 (print); 2168-2275 (electronic)
DOI: 10.1109/TCYB.2022.3197127
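
The summary above outlines a concrete pipeline: per-modality (modal-specific) GNN branches, a modal-shared branch, a reconstruction network that preserves per-modality fidelity, and attention-based fusion of the resulting embeddings. The following is a minimal sketch of that idea in PyTorch; the module names, hidden sizes, dense GCN layer, and unweighted reconstruction loss are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the MS^2-GNN idea described in the summary, assuming
# PyTorch. All names and dimensions here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenseGCNLayer(nn.Module):
    """One graph-convolution step on a dense subject-graph adjacency."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # Symmetric normalization: D^{-1/2} (A + I) D^{-1/2}
        adj = adj + torch.eye(adj.size(0), device=adj.device)
        d_inv_sqrt = adj.sum(dim=1).pow(-0.5)
        norm_adj = d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)
        return F.relu(self.linear(norm_adj @ x))


class MS2GNNSketch(nn.Module):
    """Modal-shared + modal-specific branches, reconstruction, and
    attention-based fusion, following the summary's description."""
    def __init__(self, in_dims, hid_dim=64, n_classes=2):
        super().__init__()
        # One modal-specific GNN per modality (intramodal characteristics).
        self.specific = nn.ModuleList(
            [DenseGCNLayer(d, hid_dim) for d in in_dims])
        # Per-modality projections into a common space, then one shared GNN
        # (intermodal / modal-shared characteristics).
        self.to_shared = nn.ModuleList(
            [nn.Linear(d, hid_dim) for d in in_dims])
        self.shared = DenseGCNLayer(hid_dim, hid_dim)
        # Decoders reconstruct each modality's input to preserve fidelity.
        self.decoders = nn.ModuleList(
            [nn.Linear(2 * hid_dim, d) for d in in_dims])
        # Attention scores one embedding per branch before fusion.
        self.attn = nn.Linear(hid_dim, 1)
        self.classifier = nn.Linear(hid_dim, n_classes)

    def forward(self, feats, adj):
        # feats: list of [n_subjects, d_m] tensors; adj: [n, n] subject graph.
        spec = [gnn(x, adj) for gnn, x in zip(self.specific, feats)]
        shar = [self.shared(proj(x), adj)
                for proj, x in zip(self.to_shared, feats)]
        # Reconstruction ties shared+specific codes back to the raw input.
        recon_loss = sum(
            F.mse_loss(dec(torch.cat([s, h], dim=-1)), x)
            for dec, s, h, x in zip(self.decoders, spec, shar, feats))
        # Attention-weighted fusion over all branch embeddings.
        emb = torch.stack(spec + shar, dim=1)        # [n, 2M, hid]
        w = torch.softmax(self.attn(emb), dim=1)     # [n, 2M, 1]
        fused = (w * emb).sum(dim=1)                 # [n, hid]
        return self.classifier(fused), recon_loss


# Toy usage: 3 modalities for 8 subjects on a random symmetric subject graph.
feats = [torch.randn(8, d) for d in (16, 24, 10)]
adj = (torch.rand(8, 8) > 0.5).float()
adj = ((adj + adj.t()) > 0).float()
logits, recon = MS2GNNSketch([16, 24, 10])(feats, adj)
print(logits.shape, recon.item())
```

In practice the subject graph would be built from similarity between subjects' psychophysiological features rather than sampled at random, and the classification and reconstruction losses would be combined with a tuned weighting during training.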