HIA-Net: Hierarchical Interactive Alignment Network for Multimodal Few-Shot Emotion Recognition


Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 32, pp. 2679-2683
Main Authors: Fu, Yuankang; Yang, Kaixiang; Sun, Song; Gong, Xinrong; Zeng, Huanqiang
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025

Summary: Physiological multimodal emotion recognition (PMER) has become a key research direction for advancing human-computer interaction and affective computing. However, current PMER methods are hampered by significant individual differences and limited sample sizes, making it challenging to capture complex emotional states comprehensively. To address the aforementioned issues, this letter proposes a novel multimodal Few-Shot emotion recognition model, called the Hierarchical Interactive Alignment Network (HIA-Net). Specifically, the Hierarchical Adaptive Interactive Attention (HAIA) module of HIA-Net is proposed to capture multidimensional emotional features and aggregate cross-modal information effectively. Additionally, a cross-domain optimization strategy based on the maximum mean discrepancy is proposed to enhance HIA-Net's adaptability across varying data distributions. Experimental results show that HIA-Net achieves state-of-the-art performance under Few-Shot experimental paradigms on the SEED and SEED-FRA datasets.
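The cross-domain optimization strategy mentioned in the summary builds on the maximum mean discrepancy (MMD), a standard kernel-based measure of the distance between two data distributions. As a rough illustration only (not the authors' implementation), a minimal NumPy sketch of a biased squared-MMD estimate with a Gaussian kernel, with `sigma` as an assumed bandwidth parameter, might look like:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between the rows of x and y.
    sq_dists = (
        np.sum(x**2, axis=1)[:, None]
        + np.sum(y**2, axis=1)[None, :]
        - 2.0 * x @ y.T
    )
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2(source, target, sigma=1.0):
    # Biased estimate of the squared maximum mean discrepancy:
    # mean k(s, s') + mean k(t, t') - 2 * mean k(s, t).
    k_ss = gaussian_kernel(source, source, sigma).mean()
    k_tt = gaussian_kernel(target, target, sigma).mean()
    k_st = gaussian_kernel(source, target, sigma).mean()
    return k_ss + k_tt - 2.0 * k_st
```

In a cross-domain training setup, a term like `mmd2(source_features, target_features)` is typically added to the classification loss so that feature distributions from different subjects or recording sessions are pulled closer together; how HIA-Net weights and schedules this term is detailed in the full paper.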
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2025.3584273