Deep Temporal Analysis for Non-Acted Body Affect Recognition

Bibliographic Details
Published in IEEE Transactions on Affective Computing Vol. 13; no. 3; pp. 1366-1377
Main Authors Avola, Danilo, Cinque, Luigi, Fagioli, Alessio, Foresti, Gian Luca, Massaroni, Cristiano
Format Journal Article
Language English
Published Piscataway IEEE 01.07.2022
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
ISSN 1949-3045
DOI 10.1109/TAFFC.2020.3003816

Abstract In the field of body affect recognition, the majority of literature is based on experiments performed on datasets where trained actors simulate emotional reactions. These acted and unnatural expressions differ from the more challenging genuine emotions, thus leading to less valuable results. In this article, a solution for basic non-acted emotion recognition based on 3D skeleton data and Deep Neural Networks (DNNs) is provided. The proposed work introduces three major contributions. First, temporal local movements performed by subjects are examined frame-by-frame, unlike the current state of the art in non-acted body affect recognition, where only static or global body features are considered. Second, an original set of global and time-dependent features for body movement description is provided. Third, this is one of the first works to use deep learning methods in the current non-acted body affect recognition literature. Due to the novelty of the topic, only the UCLIC dataset is currently considered the benchmark for comparative tests; on this dataset, the proposed method outperforms all competitors.
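For orientation, below is a minimal, hypothetical sketch of the kind of recurrent pipeline the abstract describes: per-frame 3D-skeleton features fed to an LSTM classifier. The sequence length, joint count, class count, layer sizes, and optimizer are illustrative assumptions and not the authors' actual architecture; in the paper the per-frame inputs are the proposed global and time-dependent movement features rather than raw joint coordinates, which stand in for them here.

# Hypothetical sketch (not the authors' code): an LSTM over per-frame
# 3D-skeleton features for emotion classification. All sizes are assumptions.
import numpy as np
import tensorflow as tf

NUM_FRAMES = 100   # assumed frames per sequence
NUM_JOINTS = 20    # assumed number of tracked skeleton joints
NUM_CLASSES = 4    # assumed number of basic emotion classes

model = tf.keras.Sequential([
    # Each frame is a flattened vector of 3D joint coordinates,
    # standing in for the paper's engineered movement features.
    tf.keras.layers.LSTM(128, input_shape=(NUM_FRAMES, NUM_JOINTS * 3)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy data in place of real skeleton sequences and labels.
x = np.random.randn(8, NUM_FRAMES, NUM_JOINTS * 3).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=(8,))
model.fit(x, y, epochs=1, batch_size=4, verbose=0)
print(model.predict(x[:1]).shape)   # -> (1, NUM_CLASSES)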
Author Foresti, Gian Luca
Cinque, Luigi
Fagioli, Alessio
Massaroni, Cristiano
Avola, Danilo
Author_xml – sequence: 1
  givenname: Danilo
  orcidid: 0000-0001-9437-6217
  surname: Avola
  fullname: Avola, Danilo
  email: avola@di.uniroma1.it
  organization: Department of Computer Science, Sapienza University, Rome, Italy
– sequence: 2
  givenname: Luigi
  orcidid: 0000-0001-9149-2175
  surname: Cinque
  fullname: Cinque, Luigi
  email: cinque@di.uniroma1.it
  organization: Department of Computer Science, Sapienza University, Rome, Italy
– sequence: 3
  givenname: Alessio
  orcidid: 0000-0002-8111-9120
  surname: Fagioli
  fullname: Fagioli, Alessio
  email: fagioli@di.uniroma1.it
  organization: Department of Computer Science, Sapienza University, Rome, Italy
– sequence: 4
  givenname: Gian Luca
  orcidid: 0000-0002-8425-6892
  surname: Foresti
  fullname: Foresti, Gian Luca
  email: gianluca.foresti@uniud.it
  organization: Department of Mathematics, Computer Science and Physics, University of Udine, Udine, Italy
– sequence: 5
  givenname: Cristiano
  orcidid: 0000-0002-6942-4851
  surname: Massaroni
  fullname: Massaroni, Cristiano
  email: massaroni@di.uniroma1.it
  organization: Department of Computer Science, Sapienza University, Rome, Italy
CODEN ITACBQ
CitedBy_id crossref_primary_10_1109_ACCESS_2025_3534145
crossref_primary_10_1016_j_knosys_2024_111856
crossref_primary_10_1109_TAI_2022_3159614
crossref_primary_10_3390_app132413322
crossref_primary_10_1016_j_eswa_2025_126427
crossref_primary_10_1007_s11042_020_10106_1
crossref_primary_10_1109_TSMC_2024_3523342
crossref_primary_10_3390_info15110721
crossref_primary_10_1016_j_eswa_2023_121419
crossref_primary_10_1142_S0129065720500689
crossref_primary_10_1142_S0129065724500242
crossref_primary_10_1016_j_knosys_2024_112744
crossref_primary_10_1142_S012906572250040X
crossref_primary_10_1016_j_eswa_2023_121981
crossref_primary_10_3390_disabilities2040044
crossref_primary_10_1038_s41598_021_98856_2
Cites_doi 10.1109/T-AFFC.2012.16
10.1109/TAFFC.2017.2740923
10.1109/TAFFC.2015.2390627
10.1073/pnas.0507650102
10.1109/ACIIW.2019.8925084
10.1016/j.ijhcs.2007.02.003
10.1145/3341163.3347728
10.1109/TMM.2019.2960588
10.1109/T-AFFC.2011.7
10.1109/T-AFFC.2013.29
10.1109/TAFFC.2018.2817622
10.1002/ejsp.2420010307
10.1167/4.8.232
10.3758/bf03192758
10.1007/11573548_1
10.1109/ACII.2009.5349316
10.1109/ICCVW.2011.6130446
10.1145/954339.954342
10.1109/TPAMI.2008.52
10.1109/TASLP.2017.2764271
10.1109/CVPR.2016.115
10.1016/j.intcom.2006.04.003
10.1177/0092070399272005
10.1162/neco.1997.9.8.1735
10.1007/978-3-540-74889-2_5
10.1109/TASLP.2019.2948773
10.1109/ICCV.2017.236
10.1109/TCIAIG.2012.2202663
10.1109/TAFFC.2018.2798576
10.1037/h0027349
10.2478/v10053-008-0006-3
10.1109/ICCVW.2017.327
10.1109/TPAMI.2016.2515606
10.1037/1528-3542.7.3.487
10.1109/TSMCB.2010.2103557
10.1109/TMM.2018.2856094
10.1109/TAFFC.2018.2874986
10.1016/j.neunet.2008.05.003
10.1023/b:jonb.0000023655.25550.be
10.1109/FUZZ-IEEE.2012.6250780
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
DBID 97E
RIA
RIE
AAYXX
CITATION
7SC
8FD
JQ2
L7M
L~C
L~D
DOI 10.1109/TAFFC.2020.3003816
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Computer and Information Systems Abstracts
Technology Research Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
DatabaseTitle CrossRef
Computer and Information Systems Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Advanced Technologies Database with Aerospace
ProQuest Computer Science Collection
Computer and Information Systems Abstracts Professional
DatabaseTitleList
Computer and Information Systems Abstracts
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Xplore Digital Library
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Computer Science
EISSN 1949-3045
EndPage 1377
ExternalDocumentID 10_1109_TAFFC_2020_3003816
9121695
Genre orig-research
GrantInformation_xml – fundername: Ministero dell’Istruzione, dell’Università e della Ricerca; MIUR
  funderid: 10.13039/501100003407
GroupedDBID 0R~
4.4
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABJNI
ABQJQ
ABVLG
AENEX
AGQYO
AGSQL
AHBIQ
AKJIK
AKQYR
ALMA_UNASSIGNED_HOLDINGS
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
EBS
EJD
HZ~
IEDLZ
IFIPE
IPLJI
JAVBF
M43
O9-
OCL
PQQKQ
RIA
RIE
RNI
RZB
AAYXX
CITATION
7SC
8FD
JQ2
L7M
L~C
L~D
ID FETCH-LOGICAL-c295t-7917741de77ce4a974daa88aede32db552c2f171a30734248478d2ce5cf1f6573
IEDL.DBID RIE
ISSN 1949-3045
IngestDate Sun Jun 29 16:13:20 EDT 2025
Thu Apr 24 23:00:45 EDT 2025
Tue Jul 01 02:57:53 EDT 2025
Wed Aug 27 02:29:15 EDT 2025
IsPeerReviewed true
IsScholarly true
Issue 3
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c295t-7917741de77ce4a974daa88aede32db552c2f171a30734248478d2ce5cf1f6573
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0001-9437-6217
0000-0002-8425-6892
0000-0002-6942-4851
0000-0001-9149-2175
0000-0002-8111-9120
PQID 2709156651
PQPubID 2040414
PageCount 12
ParticipantIDs proquest_journals_2709156651
crossref_citationtrail_10_1109_TAFFC_2020_3003816
ieee_primary_9121695
crossref_primary_10_1109_TAFFC_2020_3003816
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2022-07-01
PublicationDateYYYYMMDD 2022-07-01
PublicationDate_xml – month: 07
  year: 2022
  text: 2022-07-01
  day: 01
PublicationDecade 2020
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE transactions on affective computing
PublicationTitleAbbrev TAFFC
PublicationYear 2022
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
Hermans (ref44)
ref12
ref15
ref14
Jozefowicz (ref50)
ref11
ref10
ref54
Tamara (ref3) 2014; 5
ref17
ref16
ref19
ref18
Chung (ref51)
Popescu (ref20) 2009; 8
Zoph (ref53)
ref46
ref45
ref42
ref41
Ekman (ref7) 1978
ref43
ref9
ref4
ref6
ref5
ref40
Grafsgaard (ref35)
ref34
ref37
ref36
ref31
ref30
Abadi (ref48) 2015
ref33
ref32
ref2
ref39
ref38
Greenwald (ref1) 1989; 3
Srivastava (ref49) 2014; 15
ref24
ref23
ref26
ref25
ref22
ref21
Duchi (ref47) 2011; 12
ref28
Collins (ref52)
ref27
Michael (ref8) 1988
ref29
References_xml – volume: 5
  start-page: 859
  issue: 8
  year: 2014
  ident: ref3
  article-title: A longitudinal analysis of the relationship between positive and negative affect and health
  publication-title: J. Psychol.
– ident: ref29
  doi: 10.1109/T-AFFC.2012.16
– ident: ref27
  doi: 10.1109/TAFFC.2017.2740923
– start-page: 438
  volume-title: Proc. Int. Florida Artif. Intell. Res. Soc. Conf.
  ident: ref35
  article-title: Analyzing posture and affect in task-oriented tutoring
– ident: ref38
  doi: 10.1109/TAFFC.2015.2390627
– ident: ref9
  doi: 10.1073/pnas.0507650102
– ident: ref41
  doi: 10.1109/ACIIW.2019.8925084
– ident: ref34
  doi: 10.1016/j.ijhcs.2007.02.003
– ident: ref40
  doi: 10.1145/3341163.3347728
– ident: ref18
  doi: 10.1109/TMM.2019.2960588
– ident: ref11
  doi: 10.1109/T-AFFC.2011.7
– ident: ref31
  doi: 10.1109/T-AFFC.2013.29
– ident: ref23
  doi: 10.1109/TAFFC.2018.2817622
– ident: ref43
  doi: 10.1002/ejsp.2420010307
– ident: ref36
  doi: 10.1167/4.8.232
– ident: ref13
  doi: 10.3758/bf03192758
– ident: ref12
  doi: 10.1007/11573548_1
– ident: ref24
  doi: 10.1109/ACII.2009.5349316
– ident: ref25
  doi: 10.1109/ICCVW.2011.6130446
– ident: ref5
  doi: 10.1145/954339.954342
– volume: 15
  start-page: 1929
  issue: 1
  year: 2014
  ident: ref49
  article-title: Dropout: A simple way to prevent neural networks from overfitting
  publication-title: J. Mach. Learn. Res.
– ident: ref4
  doi: 10.1109/TPAMI.2008.52
– ident: ref21
  doi: 10.1109/TASLP.2017.2764271
– ident: ref54
  doi: 10.1109/CVPR.2016.115
– volume-title: Bodily Communication
  year: 1988
  ident: ref8
– volume-title: Facial Action Coding System (FACS): Manual
  year: 1978
  ident: ref7
– volume: 8
  start-page: 579
  issue: 7
  year: 2009
  ident: ref20
  article-title: Multilayer perceptron and neural networks
  publication-title: WSEAS Trans. Circuits Syst.
– ident: ref32
  doi: 10.1016/j.intcom.2006.04.003
– ident: ref2
  doi: 10.1177/0092070399272005
– ident: ref19
  doi: 10.1162/neco.1997.9.8.1735
– ident: ref33
  doi: 10.1007/978-3-540-74889-2_5
– volume: 12
  start-page: 2121
  year: 2011
  ident: ref47
  article-title: Adaptive subgradient methods for online learning and stochastic optimization
  publication-title: J. Mach. Learn. Res.
– ident: ref22
  doi: 10.1109/TASLP.2019.2948773
– start-page: 190
  volume-title: Proc. Int. Conf. Neural Inf. Process. Syst.
  ident: ref44
  article-title: Training and analyzing deep recurrent neural networks
– ident: ref46
  doi: 10.1109/ICCV.2017.236
– ident: ref37
  doi: 10.1109/TCIAIG.2012.2202663
– ident: ref39
  doi: 10.1109/TAFFC.2018.2798576
– ident: ref42
  doi: 10.1037/h0027349
– ident: ref28
  doi: 10.2478/v10053-008-0006-3
– year: 2015
  ident: ref48
  article-title: TensorFlow: Large-scale machine learning on heterogeneous systems
– ident: ref26
  doi: 10.1109/ICCVW.2017.327
– ident: ref6
  doi: 10.1109/TPAMI.2016.2515606
– ident: ref10
  doi: 10.1037/1528-3542.7.3.487
– ident: ref16
  doi: 10.1109/TSMCB.2010.2103557
– start-page: 1
  volume-title: Proc. Int. Conf. Learn. Representations
  ident: ref52
  article-title: Capacity and trainability in recurrent neural networks
– ident: ref45
  doi: 10.1109/TMM.2018.2856094
– start-page: 2342
  volume-title: Proc. Int. Conf. Mach. Learn.
  ident: ref50
  article-title: An empirical exploration of recurrent network architectures
– start-page: 1
  volume-title: Proc. Int. Conf. Learn. Representations
  ident: ref53
  article-title: Neural architecture search with reinforcement learning
– ident: ref30
  doi: 10.1109/TAFFC.2018.2874986
– volume: 3
  start-page: 51
  issue: 1
  year: 1989
  ident: ref1
  article-title: Affective judgment and psychophysiological response: Dimensional covariation in the evaluation of pictorial stimuli
  publication-title: J. Psychophysiol.
– ident: ref15
  doi: 10.1016/j.neunet.2008.05.003
– ident: ref14
  doi: 10.1023/b:jonb.0000023655.25550.be
– ident: ref17
  doi: 10.1109/FUZZ-IEEE.2012.6250780
– start-page: 1
  volume-title: Proc. Int. Conf. Neural Inf. Process. Syst.
  ident: ref51
  article-title: Empirical evaluation of gated recurrent neural networks on sequence modeling
SSID ssj0000333627
Snippet In the field of body affect recognition, the majority of literature is based on experiments performed on datasets where trained actors simulate emotional...
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 1366
SubjectTerms 3D skeleton
Artificial neural networks
automatic emotion recognition
body movement
Datasets
deep learning
Emotion recognition
Emotions
Feature extraction
Games
long short-term memory (LSTM)
Machine learning
Non-acted affective computing
Pain
recurrent neural network (RNN)
Skeleton
Task analysis
Three-dimensional displays
Title Deep Temporal Analysis for Non-Acted Body Affect Recognition
URI https://ieeexplore.ieee.org/document/9121695
https://www.proquest.com/docview/2709156651
Volume 13
linkProvider IEEE