Recognition of Shooter's Emotions Under Stress Based on Affective Computing

Bibliographic Details
Published in: IEEE Access, Vol. 7, pp. 62338-62343
Main Authors: Liu, Yuanguo; Jiang, Chi
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2019

Summary: This paper attempts to accurately recognize the emotional states of shooters and to derive practical adjustment strategies for these athletes. First, an experimental paradigm for emotion induction was designed that uses emotion-inducing video clips to effectively elicit four common emotions in the subjects: calmness, sadness, fear, and happiness. The electrodermal activity (EDA) signals collected under the four emotions were used to analyze the subjects' emotions, and six eigenvectors were selected by particle swarm optimization (PSO) to effectively recognize stress emotions. Then, the k-nearest neighbors (kNN) algorithm was adopted to classify the emotions under stress and to recognize the degree of stress. To improve recognition accuracy, the captured signals were subjected to baseline removal and PSO feature optimization. Furthermore, an emotional model for the athletes was built on the emotional probability space of a Markov chain (MC); this model can accurately simulate the emotional features and emotional states of the subjects. The results provide a useful reference for the objective evaluation of athletes' emotions and for maintaining stable, appropriate emotional states in shooters during daily training and competition.
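The kNN classification step summarized above can be sketched minimally as follows. This is a hypothetical illustration, not the paper's implementation: the feature vectors, labels, and distance metric below are made up, and the paper's actual pipeline uses six PSO-selected EDA eigenvectors rather than these toy two-dimensional features.

```python
# Hypothetical sketch of kNN emotion classification from EDA-derived
# feature vectors. All data values here are invented for illustration.
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Return the majority label among the k nearest training vectors
    under Euclidean distance."""
    order = sorted(range(len(train)),
                   key=lambda i: math.dist(train[i], query))
    nearest = [labels[i] for i in order[:k]]
    return Counter(nearest).most_common(1)[0][0]

# Toy EDA-style features (e.g., mean skin conductance, peak count)
train = [[0.2, 1.0], [0.3, 1.2], [0.9, 4.0], [1.0, 3.8]]
labels = ["calm", "calm", "fear", "fear"]

print(knn_predict(train, labels, [0.95, 3.9]))  # prints "fear"
```

In practice the baseline-removal and PSO feature-selection stages described in the abstract would run before this classification step, so that the query vector contains only the optimized features.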
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2916147