Research on the Multimodal Application Method Based on AT-BiGRU in College Students' Mental Health Education

Bibliographic Details
Published in: 2024 4th International Conference on Computer Science and Blockchain (CCSB), pp. 159-163
Main Author: Gui, Jingjing
Format: Conference Proceeding
Language: English
Published: IEEE, 06.09.2024
Summary: This work aims to predict the stress conditions of university students and to identify individuals with psychological abnormalities with a certain degree of precision. Multimodal state data are fed into a Bidirectional Gated Recurrent Unit (BiGRU) network. A self-attention fusion step then combines the three modalities (expression, physiological signals, and speech) into low-dimensional fused features, which are input to a classifier for emotion category determination. The primary model, denoted AT-BiGRU, was deployed on a personal computer, and the system's hardware circuit and device driver, based on the Cortex-M4 kernel, were designed for convenient, low-power, multi-channel simultaneous data acquisition and stress-state analysis. Experimental results show an average classification accuracy of 62.77% across subjects, surpassing several comparable methods and substantiating the efficacy and general applicability of the proposed method for multimodal emotion recognition. In addition, AT-BiGRU achieves an accuracy exceeding 85% in classifying three distinct stress states, enabling the assessment of psychological stress in students.
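The pipeline the summary describes (per-modality features from a BiGRU, self-attention fusion of the three modalities, fused vector to a classifier) can be sketched roughly as follows. This is a minimal NumPy illustration under assumed dimensions and a shared, untrained parameter set; it is not the paper's actual AT-BiGRU implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU cell update (biases omitted for brevity)."""
    z = 1 / (1 + np.exp(-(x @ Wz + h @ Uz)))    # update gate
    r = 1 / (1 + np.exp(-(x @ Wr + h @ Ur)))    # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)    # candidate state
    return (1 - z) * h + z * h_tilde

def bigru(seq, d_in, d_hid):
    """Toy bidirectional GRU: run the sequence forward and backward
    and concatenate the two final hidden states."""
    # One random parameter set shared by both directions (a real BiGRU
    # learns separate forward/backward parameters).
    params = [rng.standard_normal((d_in, d_hid)) * 0.1 if i % 2 == 0
              else rng.standard_normal((d_hid, d_hid)) * 0.1
              for i in range(6)]
    h_f = np.zeros(d_hid)
    for x in seq:                     # forward pass
        h_f = gru_step(x, h_f, *params)
    h_b = np.zeros(d_hid)
    for x in reversed(seq):           # backward pass
        h_b = gru_step(x, h_b, *params)
    return np.concatenate([h_f, h_b])            # shape (2 * d_hid,)

def self_attention_fusion(modality_feats):
    """Scaled dot-product self-attention over the three modality vectors,
    averaged into one low-dimensional fused feature vector."""
    X = np.stack(modality_feats)                 # (3, d)
    scores = X @ X.T / np.sqrt(X.shape[1])       # (3, 3) similarities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # row-wise softmax
    return (weights @ X).mean(axis=0)            # (d,)

# Toy sequences for the three modalities: expression, physiological, speech.
d_in, d_hid, T = 8, 16, 5
feats = [bigru(rng.standard_normal((T, d_in)), d_in, d_hid) for _ in range(3)]
fused = self_attention_fusion(feats)             # fed to a stress classifier
```

In this sketch `fused` is a 32-dimensional vector that a downstream classifier would map to an emotion or stress-state category.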
DOI: 10.1109/CCSB63463.2024.10735618