EEG-Based Emotion Recognition in Music Listening

Bibliographic Details
Published in: IEEE Transactions on Biomedical Engineering, Vol. 57, No. 7, pp. 1798-1806
Main Authors: Lin, Yuan-Pin; Wang, Chi-Hong; Jung, Tzyy-Ping; Wu, Tien-Lin; Jeng, Shyh-Kang; Duann, Jeng-Ren; Chen, Jyh-Horng
Format: Journal Article
Language: English
Published: New York, NY: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2010

Summary: Ongoing brain activity can be recorded as an electroencephalogram (EEG) to discover the links between emotional states and brain activity. This study applied machine-learning algorithms to categorize EEG dynamics according to subjects' self-reported emotional states during music listening. A framework was proposed to optimize EEG-based emotion recognition by systematically 1) seeking emotion-specific EEG features and 2) exploring the efficacy of the classifiers. A support vector machine was employed to classify four emotional states (joy, anger, sadness, and pleasure) and obtained an average classification accuracy of 82.29% ± 3.06% across 26 subjects. Further, this study identified 30 subject-independent features that were most relevant to emotional processing across subjects and explored the feasibility of using fewer electrodes to characterize the EEG dynamics during music listening. The identified features were derived primarily from electrodes near the frontal and parietal lobes, consistent with many findings in the literature. This study might lead to a practical system for noninvasive assessment of emotional states in real-world or clinical applications.
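As a rough illustration of the classification approach described in the summary (not the authors' implementation), the sketch below trains an RBF-kernel support vector machine on per-trial EEG feature vectors and estimates accuracy with cross-validation. The feature dimensionality, trial count, labels, and hyperparameters are placeholder assumptions chosen only for demonstration, and scikit-learn stands in for whatever toolchain the study actually used.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data standing in for real EEG features: 200 trials x 30
# features (e.g., spectral band power from frontal/parietal channels),
# with one of four emotion labels per trial
# (0 = joy, 1 = anger, 2 = sadness, 3 = pleasure).
X = rng.standard_normal((200, 30))
y = rng.integers(0, 4, size=200)

# Standardize features, fit an RBF-kernel SVM, and evaluate with 5-fold
# cross-validation; real data would use the recorded trials and labels.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")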
ISSN: 0018-9294
EISSN: 1558-2531
DOI: 10.1109/TBME.2010.2048568