Identifying dominant emotional state using handwriting and drawing samples by fusing features

Bibliographic Details
Published in: Applied Intelligence (Dordrecht, Netherlands), Vol. 53, No. 3, pp. 2798–2814
Main Authors: Rahman, Atta Ur; Halim, Zahid
Format: Journal Article
Language: English
Published: New York: Springer US, 01.02.2023
Publisher: Springer Nature B.V.
Summary: The ability to quickly and non-invasively determine a subject's emotional state would be a milestone in research on emotionally intelligent computing systems. Identifying emotions through everyday activities such as writing and drawing can benefit one's well-being, and tablet devices and other human-machine interfaces have made collecting handwriting and drawing samples simpler. To learn more from writing and drawing signals, they need to be investigated in the temporal, spectral, and cepstral domains; extracting more information in this way can improve classification accuracy. This study combines temporal, spectral, and Mel Frequency Cepstral Coefficient (MFCC) methods to extract features from such signals and examines their correlation with the depression, anxiety, and stress emotional states of humans. To capture spatial features, velocities are also computed as variations of displacement in the x- and y-directions while performing the tasks. A Bidirectional Long Short-Term Memory (BiLSTM) network is used to classify the generated feature vectors. To evaluate the proposed work, multiple publicly available benchmark datasets are utilized. Through in-depth investigation, this work determines which activities and features best describe a specific emotional state. The experimental results demonstrate that fusing several features improves recognition accuracy significantly. For identifying emotional states such as depression, anxiety, and stress, this work achieves a classification improvement ranging from 5.32% to 8.9% over the baseline approaches.
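The feature-fusion pipeline sketched in the summary (spatial velocity features from x- and y-displacements, combined with spectral features, concatenated into one vector per sample) can be illustrated as follows. This is a minimal sketch under stated assumptions: the function names, the choice of leading FFT magnitudes as the spectral summary, and the specific statistics are hypothetical stand-ins, not the paper's actual feature set or MFCC implementation.

```python
import numpy as np

def velocity_features(x, y, t):
    """Spatial features from a pen trajectory: velocities computed as the
    variation of displacement in the x- and y-directions over time.
    (Hypothetical statistics; the paper's exact feature set is not given here.)"""
    vx = np.gradient(x, t)  # velocity in x-direction
    vy = np.gradient(y, t)  # velocity in y-direction
    speed = np.hypot(vx, vy)
    return np.array([vx.mean(), vy.mean(), speed.mean(), speed.std()])

def spectral_features(signal, n_coeffs=8):
    """Leading FFT magnitudes as a simple spectral summary -- a stand-in for
    the temporal/spectral/MFCC extraction described in the abstract."""
    mag = np.abs(np.fft.rfft(signal - np.mean(signal)))
    return mag[:n_coeffs]

# Fuse spatial and spectral features into a single vector per sample,
# as the summary describes, using a synthetic circular pen trajectory.
t = np.linspace(0.0, 1.0, 200)
x = np.cos(2 * np.pi * 3 * t)
y = np.sin(2 * np.pi * 3 * t)
fused = np.concatenate([velocity_features(x, y, t), spectral_features(x)])
print(fused.shape)
```

In the paper's setting, one such fused vector would be produced per handwriting or drawing task and then fed to the BiLSTM classifier.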
ISSN: 0924-669X, 1573-7497
DOI: 10.1007/s10489-022-03552-x