Stress Detection via Multimodal Multitemporal-Scale Fusion: A Hybrid of Deep Learning and Handcrafted Feature Approach
| Published in | IEEE Sensors Journal, Vol. 23, No. 22, pp. 27817–27827 |
|---|---|
| Main Authors | , , , , , |
| Format | Journal Article |
| Language | English |
| Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 15.11.2023 |
| Summary | Stress has significant effects on an individual's daily life in modern society, making its detection a topic of great interest over the past decade. While numerous studies have delved into this field, the accuracy and reliability of stress detection methods still have room for improvement. In this study, we propose a multimodal multitemporal-scale fusion-based stress detection system. First, a hybrid feature extraction module is proposed, which generates a feature set from both handcrafted and deep learning (DL) perspectives across multiple temporal scales. Second, a stress detection module is proposed based on multisource feature fusion of electrocardiogram (ECG) and electrodermal activity (EDA) signals, which classifies a subject's state into baseline (normal), stress, and amusement. In addition, the proposed system is tested on the open-access WESAD dataset using leave-one-out cross-validation to verify its performance. The experimental results demonstrate that the proposed system succeeds in learning person-independent features for stress detection with high accuracy. (See the illustrative sketch after this record.) |
| ISSN | 1530-437X, 1558-1748 |
| DOI | 10.1109/JSEN.2023.3314718 |
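
The summary describes a leave-one-subject-out evaluation of handcrafted plus deep-learning features fused across ECG and EDA signals at several temporal scales. Below is a minimal, hypothetical Python sketch of that evaluation protocol only; the synthetic data, feature extractors, window lengths, and random-forest classifier are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch: hypothetical handcrafted + "deep" feature fusion with
# leave-one-subject-out (LOSO) cross-validation, in the spirit of the summary above.
# Synthetic signals stand in for WESAD ECG/EDA recordings; the feature extractors,
# window lengths, and classifier are assumptions, not the authors' method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
FS = 64                      # assumed common sampling rate after resampling (Hz)
SCALES_S = (10, 30, 60)      # multiple temporal scales (window lengths in seconds)
CLASSES = ("baseline", "stress", "amusement")

def handcrafted_features(window: np.ndarray) -> np.ndarray:
    # Simple statistical descriptors as stand-ins for HRV / EDA features.
    return np.array([window.mean(), window.std(), window.min(), window.max(),
                     np.percentile(window, 25), np.percentile(window, 75)])

def deep_features(window: np.ndarray, dim: int = 8) -> np.ndarray:
    # Placeholder for a learned (CNN/LSTM) embedding: a fixed random projection here.
    proj = np.random.default_rng(42).standard_normal((dim, window.size))
    return proj @ window / window.size

def fuse_features(ecg: np.ndarray, eda: np.ndarray) -> np.ndarray:
    # Multimodal, multi-temporal-scale fusion: concatenate handcrafted and
    # "deep" features of both signals over every window length.
    feats = []
    for seconds in SCALES_S:
        n = seconds * FS
        for sig in (ecg, eda):
            w = sig[-n:]                      # last window at this scale
            feats.append(handcrafted_features(w))
            feats.append(deep_features(w))
    return np.concatenate(feats)

# Synthetic "subjects": each contributes a few labelled segments per class.
subjects = {}
for s in range(15):
    X, y = [], []
    for label, cls in enumerate(CLASSES):
        for _ in range(10):
            length = max(SCALES_S) * FS
            ecg = rng.standard_normal(length) + 0.3 * label   # class-dependent shift
            eda = rng.standard_normal(length) + 0.2 * label
            X.append(fuse_features(ecg, eda))
            y.append(label)
    subjects[s] = (np.stack(X), np.array(y))

# Leave-one-subject-out cross-validation: train on all subjects but one and test on
# the held-out subject, so the classifier must rely on person-independent features.
accs = []
for held_out in subjects:
    X_tr = np.concatenate([subjects[s][0] for s in subjects if s != held_out])
    y_tr = np.concatenate([subjects[s][1] for s in subjects if s != held_out])
    X_te, y_te = subjects[held_out]
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    accs.append(accuracy_score(y_te, clf.predict(X_te)))

print(f"LOSO mean accuracy over {len(accs)} synthetic subjects: {np.mean(accs):.3f}")
```

In this sketch the fused vector simply concatenates both feature families from both modalities at every window length; reported accuracy is meaningful only for the synthetic data generated here.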