Recognition of emotional states induced by music videos based on nonlinear feature extraction and SOM classification

Bibliographic Details
Published in: 2014 21th Iranian Conference on Biomedical Engineering (ICBME), pp. 333 - 337
Main Authors: Hatamikia, S.; Nasrabadi, A. M.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.11.2014
DOI: 10.1109/ICBME.2014.7043946

Abstract: This research investigates the relationship between electroencephalogram (EEG) signals and human emotional states. A subject-independent emotion recognition system is proposed that uses EEG signals collected during emotional audio-visual inductions to classify different classes of the continuous valence-arousal model. First, four feature extraction methods based on approximate entropy, spectral entropy, Katz's fractal dimension and Petrosian's fractal dimension were applied; then, a two-stage feature selection method based on the Dunn index and the sequential forward selection (SFS) algorithm was used to select the most informative feature subsets. A Self-Organizing Map (SOM) classifier was used to separate the emotional classes under 5-fold cross-validation. The best results were achieved by combining all features, with average accuracies of 68.92% and 71.25% for the two classes of valence and arousal, respectively. Furthermore, a hierarchical model built from two classifiers was used to classify four emotional classes defined by valence and arousal levels, and an average accuracy of 55.15% was achieved.
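The four nonlinear features named in the abstract (approximate entropy, spectral entropy, Katz's fractal dimension and Petrosian's fractal dimension) have standard definitions. The sketch below shows how they might be computed for a single EEG channel with NumPy/SciPy; it is a minimal illustration under common default parameters (embedding dimension m = 2, tolerance r = 0.2 times the signal's standard deviation, Welch PSD), not the authors' implementation or their parameter choices.

```python
# Sketch of the four nonlinear/entropy features named in the abstract, computed on a
# single 1-D EEG channel. Parameters (m=2, r=0.2*std, Welch PSD) are common defaults
# and assumptions, not values reported by the paper.
import numpy as np
from scipy.signal import welch


def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D signal (naive O(N^2) version)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def phi(order):
        # Embed the signal into overlapping vectors of length `order`.
        emb = np.array([x[i:i + order] for i in range(n - order + 1)])
        # Chebyshev distance between every pair of embedded vectors.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)          # fraction of "similar" vectors
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)


def spectral_entropy(x, fs, normalize=True):
    """Shannon entropy of the normalized power spectral density."""
    _, psd = welch(x, fs=fs)
    p = psd / psd.sum()
    se = -np.sum(p * np.log2(p + 1e-12))
    return se / np.log2(len(p)) if normalize else se


def katz_fd(x):
    """Katz's fractal dimension: log10(n) / (log10(n) + log10(d / L))."""
    x = np.asarray(x, dtype=float)
    L = np.abs(np.diff(x)).sum()                 # total "length" of the waveform
    d = np.max(np.abs(x - x[0]))                 # max distance from the first sample
    n = len(x) - 1                               # number of steps
    return np.log10(n) / (np.log10(n) + np.log10(d / L))


def petrosian_fd(x):
    """Petrosian's fractal dimension, based on sign changes of the first difference."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    diff = np.diff(x)
    n_delta = np.sum(diff[1:] * diff[:-1] < 0)   # number of sign changes
    return np.log10(n) / (np.log10(n) + np.log10(n / (n + 0.4 * n_delta)))


def channel_features(x, fs):
    """Stack the four features for one EEG channel into a feature vector."""
    return np.array([approximate_entropy(x), spectral_entropy(x, fs),
                     katz_fd(x), petrosian_fd(x)])
```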
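Likewise, a SOM can act as a classifier by assigning each neuron the majority label of the training samples it wins and labelling test samples by their best-matching unit. The following 5-fold cross-validation sketch uses the third-party minisom package purely for illustration; the map size, iteration count, scaling and labelling rule are assumptions, and the paper does not report its SOM configuration or the two-stage Dunn-index/SFS feature selection shown in the abstract.

```python
# Illustrative sketch of a SOM-based classifier evaluated with 5-fold cross-validation,
# roughly mirroring the pipeline described in the abstract. The minisom package and the
# majority-vote neuron labelling are assumptions made for this example.
from collections import Counter, defaultdict

import numpy as np
from minisom import MiniSom
from sklearn.model_selection import StratifiedKFold
from sklearn.preprocessing import StandardScaler


def som_classify(X, y, grid=(6, 6), n_iter=2000, seed=0):
    """Return the mean 5-fold CV accuracy of a SOM classifier on features X, labels y."""
    accuracies = []
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=seed)
    for train_idx, test_idx in cv.split(X, y):
        scaler = StandardScaler().fit(X[train_idx])
        X_tr, X_te = scaler.transform(X[train_idx]), scaler.transform(X[test_idx])
        y_tr, y_te = y[train_idx], y[test_idx]

        som = MiniSom(grid[0], grid[1], X.shape[1], sigma=1.0,
                      learning_rate=0.5, random_seed=seed)
        som.train_random(X_tr, n_iter)

        # Label each neuron by a majority vote of the training samples it wins.
        votes = defaultdict(Counter)
        for xi, yi in zip(X_tr, y_tr):
            votes[som.winner(xi)][yi] += 1
        default_label = Counter(y_tr).most_common(1)[0][0]

        # Classify test samples by the label of their best-matching unit.
        y_pred = []
        for xi in X_te:
            w = som.winner(xi)
            y_pred.append(votes[w].most_common(1)[0][0] if w in votes else default_label)
        accuracies.append(float(np.mean(np.array(y_pred) == y_te)))
    return float(np.mean(accuracies))
```

Feeding a matrix built from channel_features above into som_classify would reproduce the general shape of the pipeline, though not the paper's exact feature selection or results.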
Author Details:
– Hatamikia, S. (shatamikia11@gmail.com), Dept. of Biomedical Eng., Islamic Azad Univ., Tehran, Iran
– Nasrabadi, A. M. (nasrabadi@shahed.ac.ir), Dept. of Biomedical Eng., Shahed Univ., Tehran, Iran
EISBN 9781479974177
9781479974184
1479974188
147997417X
Genre orig-research
PageCount 5
SubjectTerms Accuracy
Biomedical engineering
Dunn index
Electroencephalography
Emotion recognition
Entropy
Feature extraction
Fractals
Nonlinear analysis
Self Organization Map (SOM)
Sequential forward feature selection algorithm (SFS)
URI https://ieeexplore.ieee.org/document/7043946