Stress Detection via Multimodal Multitemporal-Scale Fusion: A Hybrid of Deep Learning and Handcrafted Feature Approach

Bibliographic Details
Published in: IEEE Sensors Journal, Vol. 23, No. 22, pp. 27817-27827
Main Authors: Zhao, Liang; Niu, Xiaojing; Wang, Lincong; Niu, Jiale; Zhu, Xiaoliang; Dai, Zhicheng
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 15.11.2023
Summary: Stress has significant effects on an individual's daily life in modern society, making its detection a topic of great interest over the past decade. While numerous studies have delved into this field, the accuracy and reliability of stress detection methods still have room for improvement. In this study, we propose a multimodal multitemporal-scale fusion-based stress detection system. First, a hybrid feature extraction module is proposed, which generates a feature set from the perspective of handcrafted and deep learning (DL) analysis across multiple temporal scales. Second, a stress detection module is proposed based on multisource feature fusion of electrocardiogram (ECG) and electrodermal activity (EDA) signals, which classifies a subject's state into baseline (normal), stress, or amusement. In addition, the proposed system is tested on the open-access WESAD dataset using leave-one-out cross-validation to verify its performance. The experimental results demonstrate that the proposed system succeeds in learning person-independent features for stress detection with high accuracy.
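The evaluation protocol described in the abstract, holding out each subject in turn so the classifier never sees the test subject during training, can be sketched as below. This is a minimal illustration with synthetic data and a simple nearest-centroid classifier standing in for the paper's model; the feature names, dimensions, and fusion-by-concatenation step are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of leave-one-subject-out (LOSO) evaluation with
# early fusion of ECG and EDA feature vectors. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
N_SUBJECTS, WINDOWS, CLASSES = 5, 30, 3  # classes: baseline, stress, amusement

def fuse(ecg_feats, eda_feats):
    """Concatenate per-window ECG and EDA features (early fusion)."""
    return np.concatenate([ecg_feats, eda_feats], axis=1)

# Synthetic per-subject features standing in for handcrafted + DL outputs.
data = []
for s in range(N_SUBJECTS):
    y = rng.integers(0, CLASSES, size=WINDOWS)
    ecg = rng.normal(y[:, None], 0.3, size=(WINDOWS, 4))  # 4 ECG features
    eda = rng.normal(y[:, None], 0.3, size=(WINDOWS, 3))  # 3 EDA features
    data.append((fuse(ecg, eda), y))

def fit_centroids(X, y):
    """Per-class mean feature vector."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, X):
    """Assign each window to the nearest class centroid."""
    labels = np.array(sorted(centroids))
    dists = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in labels])
    return labels[dists.argmin(axis=0)]

# LOSO: each subject is held out once, so accuracy is person-independent.
accs = []
for held_out in range(N_SUBJECTS):
    X_tr = np.vstack([X for i, (X, _) in enumerate(data) if i != held_out])
    y_tr = np.concatenate([y for i, (_, y) in enumerate(data) if i != held_out])
    X_te, y_te = data[held_out]
    model = fit_centroids(X_tr, y_tr)
    accs.append(float((predict(model, X_te) == y_te).mean()))

mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 3))
```

Averaging over the held-out folds yields the person-independent accuracy that the abstract's "leave-one-out" evaluation reports; swapping the nearest-centroid stand-in for any stronger classifier leaves the protocol unchanged.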
ISSN: 1530-437X
EISSN: 1558-1748
DOI:10.1109/JSEN.2023.3314718