The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 20, No. 8, p. 2308
Main Authors: Hazer-Rau, Dilana; Meudt, Sascha; Daucher, Andreas; Spohrs, Jennifer; Hoffmann, Holger; Schwenker, Friedhelm; Traue, Harald C.
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 17.04.2020
Abstract: In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based, HCI-relevant emotional and cognitive load states. It consists of six experimental sequences, inducing Interest, Overload, Normal, Easy, Underload, and Frustration. Each sequence is followed by subjective feedback to validate the induction, a respiration baseline to level off the physiological reactions, and a summary of results. Furthermore, prior to the experiment, three questionnaires related to emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI) were collected from each subject to evaluate the stability of the induction paradigm. Based on this HCI scenario, the University of Ulm Multimodal Affective Corpus (uulmMAC), consisting of two homogeneous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities, including 4 × video, 3 × audio, and 7 × biophysiological, depth, and pose streams. Additional labels and annotations were also collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final uulmMAC dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedback shows significant differences between the sequences, consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our uulmMAC database is a valuable contribution to the fields of affective computing and multimodal data analysis: acquired in a mobile interactive scenario close to real HCI, it comprises a large number of subjects and allows transtemporal investigations. Validated via subjective feedback and checked for quality issues, it can be used for affective computing and machine learning applications.
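As a purely illustrative sketch (not part of the record, and not the dataset's actual API), the corpus layout the abstract describes — six induction sequences and 16 modality streams per recording session (4 video, 3 audio, 7 biophysiological, plus depth and pose) — could be modeled like this; all names here are assumptions:

```python
from dataclasses import dataclass, field

# Six induced states, in the order listed in the abstract.
SEQUENCES = ["Interest", "Overload", "Normal", "Easy", "Underload", "Frustration"]

# 16 modality streams: 4 video + 3 audio + 7 biophysiological + depth + pose.
# Stream names are hypothetical placeholders, not the corpus's file naming.
MODALITIES = (
    [f"video_{i}" for i in range(1, 5)]
    + [f"audio_{i}" for i in range(1, 4)]
    + [f"bio_{i}" for i in range(1, 8)]
    + ["depth", "pose"]
)

@dataclass
class RecordingSession:
    """Hypothetical container for one uulmMAC recording session."""
    subject_id: int
    # One (initially empty) sample buffer per modality stream.
    streams: dict = field(default_factory=lambda: {m: [] for m in MODALITIES})
```

This only mirrors the counts stated in the abstract; the published corpus defines its own file formats and identifiers.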
Author Affiliations: 1. Section Medical Psychology, University of Ulm, Frauensteige 6, 89075 Ulm, Germany; 2. Institute of Neural Information Processing, University of Ulm, James-Frank-Ring, 89081 Ulm, Germany
Copyright: 2020 by the authors. This work is licensed under http://creativecommons.org/licenses/by/3.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
DOI: 10.3390/s20082308
Discipline: Engineering
EISSN: 1424-8220
External IDs: PMC7219061; PMID 32316626
Funding: Ministerium für Wissenschaft, Forschung und Kunst Baden-Württemberg (Margarete von Wrangell Habilitationsprogramm); Deutsche Forschungsgemeinschaft (SFB/TRR62)
ISSN: 1424-8220
Open Access: yes
Peer Reviewed: yes
Keywords: cognitive load; human-computer interaction; affective computing; interest; frustration; underload; multimodal sensors; affective corpus; stress research; machine learning; emotion recognition; overload
License: Creative Commons Attribution (CC BY), https://creativecommons.org/licenses/by/4.0/. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the CC BY license.
Notes: These authors contributed equally to this work.
ORCID: 0000-0001-5118-0812 (Schwenker, Friedhelm)
Open Access Link: http://journals.scholarsportal.info/openUrl.xqy?doi=10.3390/s20082308
References Koelstra (ref_38) 2012; 3
ref_50
Cooper (ref_55) 2010; 92
ref_13
ref_12
ref_11
Gingell (ref_15) 2003; 33
Soleymani (ref_36) 2012; 3
Just (ref_18) 2003; 4
Lisetti (ref_46) 2004; 11
ref_52
Wierwille (ref_14) 1993; 35
ref_16
ref_59
Gross (ref_53) 2003; 85
Gross (ref_35) 1995; 9
ref_61
Heslenfeld (ref_17) 2009; 45
ref_60
Bauer (ref_26) 2015; 9
ref_25
ref_23
ref_67
ref_22
ref_66
Hudlicka (ref_40) 2003; 59
ref_65
Schwenker (ref_69) 2017; Volume 10183
ref_20
Jacobs (ref_56) 2015; 72
ref_64
Gosling (ref_57) 2003; 37
ref_62
Klingner (ref_27) 2011; 48
Greene (ref_10) 2016; 5
ref_29
ref_28
Luria (ref_21) 2012; 44
Kirschbaum (ref_70) 1993; 28
Chanel (ref_39) 2011; 41
ref_72
Schwenker (ref_68) 2017; Volume 10183
Sweller (ref_2) 1998; 10
Kurosu (ref_51) 2016; Volume 9732
ref_34
ref_33
ref_32
ref_31
ref_30
ref_73
Klein (ref_43) 2002; 14
Choi (ref_74) 2012; 16
Moos (ref_5) 2013; 22
Abler (ref_54) 2009; 55
Muck (ref_58) 2007; 23
Stephanidis (ref_37) 2015; Volume 528
ref_47
ref_45
ref_44
ref_41
ref_3
Silvia (ref_48) 2008; 17
Conati (ref_42) 2002; 16
ref_49
Sennersten (ref_19) 2020; 79
Kurosu (ref_4) 2016; Volume 9731
ref_9
Mattys (ref_24) 2011; 65
Thiam (ref_63) 2018; 48
Stroop (ref_71) 1935; 18
ref_7
Taylor (ref_8) 2013; 58
Paas (ref_1) 2003; 38
ref_6
References_xml – ident: ref_23
  doi: 10.1007/978-3-319-28109-4
– ident: ref_62
  doi: 10.1109/SSCI.2015.251
– volume: 48
  start-page: 709
  year: 2018
  ident: ref_63
  article-title: Temporal Dependency Based Multi-modal Active Learning Approach for Audiovisual Event Detection
  publication-title: Neural Process. Lett.
  doi: 10.1007/s11063-017-9719-y
– ident: ref_32
– volume: 9
  start-page: 1
  year: 2015
  ident: ref_26
  article-title: Estimating cognitive load during self-regulation of brain activity and neurofeedback with therapeutic brain-computer interfaces
  publication-title: Front. Behav. Neurosci.
  doi: 10.3389/fnbeh.2015.00021
– volume: Volume 528
  start-page: 110
  year: 2015
  ident: ref_37
  article-title: Emotion Elicitation Using Film Clips: Effect of Age Groups on Movie Choice and Emotion Rating
  publication-title: Human-Computer Interaction
– volume: 45
  start-page: 1212
  year: 2009
  ident: ref_17
  article-title: Tuning down the emotional brain: An fMRI study of the effects of cognitive load on the processing of affective images
  publication-title: NeuroImage
  doi: 10.1016/j.neuroimage.2009.01.016
– volume: 35
  start-page: 263
  year: 1993
  ident: ref_14
  article-title: Recommendations for mental workload measurement in a test and evaluation environment
  publication-title: Hum. Factors J. Hum. Factors Erg.Soc.
  doi: 10.1177/001872089303500205
– ident: ref_65
– ident: ref_49
  doi: 10.3102/978-0-935302-42-4_5
– volume: Volume 10183
  start-page: 12
  year: 2017
  ident: ref_69
  article-title: Fusion Architectures for Multimodal Cognitive Load Recognition
  publication-title: Multimodal Pattern Recognition of Social Signals in Human-Computer-Interaction (MPRSS 2016)
– volume: Volume 9731
  start-page: 77
  year: 2016
  ident: ref_4
  article-title: Is there a Biological Basis for Success in Human Companion Interaction?—Results from a Transsituational Study
  publication-title: Human-Computer Interaction—Theory, Design, Development and Practice
  doi: 10.1007/978-3-319-39510-4_8
– volume: 16
  start-page: 555
  year: 2002
  ident: ref_42
  article-title: Probabilistic assessment of user’s emotions in educational games
  publication-title: Appl. Ai.
– ident: ref_61
– volume: 59
  start-page: 1
  year: 2003
  ident: ref_40
  article-title: To feel or not to feel: The role of affect in human–computer interaction
  publication-title: Int. J. Hum.-Comput. Stud.
  doi: 10.1016/S1071-5819(03)00047-8
– volume: 14
  start-page: 119
  year: 2002
  ident: ref_43
  article-title: This computer responds to user frustration: Theory, design and results
  publication-title: Interact. Comput.
  doi: 10.1016/S0953-5438(01)00053-4
– ident: ref_64
  doi: 10.1145/2668056.2668062
– ident: ref_73
  doi: 10.1145/2663204.2663257
– volume: 58
  start-page: 175
  year: 2013
  ident: ref_8
  article-title: The view from the road: The contribution of on-road glance-monitoring technologies to understanding driver behavior
  publication-title: Accid. Anal. Prev.
  doi: 10.1016/j.aap.2013.02.008
– ident: ref_33
  doi: 10.1109/FUZZ-IEEE.2012.6250778
– ident: ref_52
– ident: ref_16
  doi: 10.1145/3173574.3174226
– ident: ref_28
  doi: 10.1109/CVPRW.2014.62
– ident: ref_9
  doi: 10.1145/1518701.1519035
– volume: 16
  start-page: 279
  year: 2012
  ident: ref_74
  article-title: Development and Evaluation of an Ambulatory Stress Monitor based on wearable sensors
  publication-title: IEEE Trans. Inf. Technol. Biomed.
  doi: 10.1109/TITB.2011.2169804
– volume: 48
  start-page: 323
  year: 2011
  ident: ref_27
  article-title: Effects of visual and verbal presentation on cognitive load in vigilance, memory and arithmetic tasks
  publication-title: J. Psychophys.
  doi: 10.1111/j.1469-8986.2010.01069.x
– ident: ref_41
– volume: 37
  start-page: 504
  year: 2003
  ident: ref_57
  article-title: A very brief measure of the big five personality domains
  publication-title: J. Res. Personal.
  doi: 10.1016/S0092-6566(03)00046-1
– volume: 23
  start-page: 23
  year: 2007
  ident: ref_58
  article-title: Construct validation of a short five-factor model instrument: A self-peer study on the German adaptation of the Ten-Item Personality Inventory (TIPI-G)
  publication-title: Eur. J. Psychol. Assess.
  doi: 10.1027/1015-5759.23.3.166
– ident: ref_13
– volume: 44
  start-page: 575
  year: 2012
  ident: ref_21
  article-title: A computerized multidimensional measurement of mental workload via handwriting analysis
  publication-title: Behav. Res. Methods.
  doi: 10.3758/s13428-011-0159-8
– ident: ref_45
– volume: 79
  start-page: 3145
  year: 2020
  ident: ref_19
  article-title: Modeling cognitive load and physiological arousal through pupil diameter and heart rate
  publication-title: Multimed. Tools Appl.
  doi: 10.1007/s11042-018-6518-z
– volume: 17
  start-page: 57
  year: 2008
  ident: ref_48
  article-title: Interest—The curious emotion
  publication-title: Curr. Dir. Psychol. Sci.
  doi: 10.1111/j.1467-8721.2008.00548.x
– ident: ref_20
– volume: 65
  start-page: 145
  year: 2011
  ident: ref_24
  article-title: Effects of cognitive load on speech recognition
  publication-title: J. Mem. Langg.
  doi: 10.1016/j.jml.2011.04.004
– ident: ref_59
– ident: ref_3
  doi: 10.1007/978-1-4419-8126-4
– ident: ref_25
  doi: 10.1109/BIBE.2013.6701686
– volume: 72
  start-page: 72
  year: 2015
  ident: ref_56
  article-title: The German TEIQue-SF: Factorial structure and relations to agentic and communal traits and mental health
  publication-title: Pers. Individ. Differ.
  doi: 10.1016/j.paid.2014.09.003
– volume: 18
  start-page: 643
  year: 1935
  ident: ref_71
  article-title: Studies of interference in serial verbal reactions
  publication-title: J. Exp. Psychol.
  doi: 10.1037/h0054651
– ident: ref_44
  doi: 10.1145/2750858.2805847
– ident: ref_7
– volume: 4
  start-page: 56
  year: 2003
  ident: ref_18
  article-title: Neuroindices of cognitive workload: Neuroimaging, pupillometric and event-related potential studies of brain work
  publication-title: Theor. Issues Ergon. Sci.
  doi: 10.1080/14639220210159735
– ident: ref_29
  doi: 10.1007/978-3-319-31700-7
– ident: ref_72
  doi: 10.1109/ACII.2013.105
– volume: 22
  start-page: 39
  year: 2013
  ident: ref_5
  article-title: Examining hypermedia learning: The role of cognitive load and self-regulated learning
  publication-title: J. Educ. Multimed. Hypermed.
– ident: ref_34
– volume: 11
  start-page: 1672
  year: 2004
  ident: ref_46
  article-title: Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals
  publication-title: Eurasip J. Adv. Signal. Process.
– ident: ref_47
– volume: Volume 9732
  start-page: 233
  year: 2016
  ident: ref_51
  article-title: In-depth analysis of multimodal interaction: An explorative paradigm
  publication-title: Human-Computer Interaction—Interaction Platforms and Techniques
  doi: 10.1007/978-3-319-39516-6_22
– volume: 85
  start-page: 348
  year: 2003
  ident: ref_53
  article-title: Individual differences in two emotion regulation processes: Implications for affect, relationships, and well-being
  publication-title: J. Personal. Soc. Psychol.
  doi: 10.1037/0022-3514.85.2.348
– ident: ref_11
– volume: 55
  start-page: 144
  year: 2009
  ident: ref_54
  article-title: Emotion Regulation Questionnaire—Eine deutschsprachige Fassung des ERQ von Gross und John (2003)
  publication-title: Diagnostica
  doi: 10.1026/0012-1924.55.3.144
– volume: 10
  start-page: 251
  year: 1998
  ident: ref_2
  article-title: Cognitive Architecture and Instructional Design
  publication-title: Educ. Psychol. Rev.
  doi: 10.1023/A:1022193728205
– volume: 3
  start-page: 211
  year: 2012
  ident: ref_36
  article-title: Multimodal emotion recognition in response to videos
  publication-title: IEEE Trans. Affect. Comput.
  doi: 10.1109/T-AFFC.2011.37
– ident: ref_6
  doi: 10.1007/978-94-007-5207-8_2
– ident: ref_67
– volume: 28
  start-page: 76
  year: 1993
  ident: ref_70
  article-title: The Trier Social Stress Test—A Tool for Investigating Psychobiological Stress Responses in a Laboratory Setting
  publication-title: Neuropsychobiology
  doi: 10.1159/000119004
– volume: 33
  start-page: 1
  year: 2003
  ident: ref_15
  article-title: Review of Workload Measurement, Analysis and Interpretation Methods
  publication-title: Eur. Organ. Saf. Air Navig.
– ident: ref_50
– volume: 38
  start-page: 63
  year: 2003
  ident: ref_1
  article-title: Cognitive load measurement as a means to advance cognitive load theory
  publication-title: Educ. Psychol.
  doi: 10.1207/S15326985EP3801_8
– volume: 5
  start-page: 44
  year: 2016
  ident: ref_10
  article-title: A Survey of Affective Computing for Stress Detection: Evaluating technologies in stress detection for better health
  publication-title: IEEE Consum. Electron. Mag.
  doi: 10.1109/MCE.2016.2590178
– volume: 3
  start-page: 18
  year: 2012
  ident: ref_38
  article-title: Deap: A database for emotion analysis using physiological signals
  publication-title: IEEE Trans. Affect. Comput.
  doi: 10.1109/T-AFFC.2011.15
– volume: 92
  start-page: 449
  year: 2010
  ident: ref_55
  article-title: A psychometric analysis of the Trait Emotional Intelligence Questionnaire-Short Form (TEIQue-SF) using item response theory
  publication-title: J. Personal. Assess.
  doi: 10.1080/00223891.2010.497426
– volume: 10183
  start-page: 12
  year: 2017
  ident: ref_68
  article-title: Bimodal Recognition of Cognitive Load Based on Speech and Physiological Changes
  publication-title: Multimodal Pattern Recognition of Social Signals in Human-Computer Interaction (MPRSS 2016)
– ident: ref_12
– volume: 9
  start-page: 87
  year: 1995
  ident: ref_35
  article-title: Emotion elicitation using films
  publication-title: Cogn. Emot.
  doi: 10.1080/02699939508408966
– ident: ref_60
  doi: 10.1109/ISSPA.2012.6310495
– ident: ref_66
  doi: 10.1007/978-3-319-46182-3_24
– ident: ref_30
  doi: 10.21437/Interspeech.2014-104
– volume: 41
  start-page: 1052
  year: 2011
  ident: ref_39
  article-title: Emotion Assessment From Physiological Signals for Adaptation of Game Difficulty
  publication-title: IEEE Trans. Syst. Man Cybern. Part. A Syst. Hum.
  doi: 10.1109/TSMCA.2011.2116000
– ident: ref_31
  doi: 10.1155/2016/1601879
– ident: ref_22
SSID ssj0023338
Score 2.4041915
Snippet In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile...
SourceID doaj
pubmedcentral
proquest
pubmed
crossref
SourceType Open Website
Open Access Repository
Aggregation Database
Index Database
Enrichment Source
StartPage 2308
SubjectTerms Acquisitions & mergers
affective corpus
Behavior
Datasets
Emotions
Emotions - physiology
Experiments
frustration
Human performance
Human-computer interaction
Humans
interest
Machine Learning
multimodal sensors
overload
Pattern Recognition, Visual - physiology
Physiology
Sensors
underload
User-Computer Interface
Writing
Title The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction
URI https://www.ncbi.nlm.nih.gov/pubmed/32316626
https://www.proquest.com/docview/2392900146
https://www.proquest.com/docview/2393574202
https://pubmed.ncbi.nlm.nih.gov/PMC7219061
https://doaj.org/article/41f3a0674fdf4bb0a70d5d23ef5621f6
Volume 20
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=The+uulmMAC+Database%E2%80%94A+Multimodal+Affective+Corpus+for+Affective+Computing+in+Human-Computer+Interaction&rft.jtitle=Sensors+%28Basel%2C+Switzerland%29&rft.au=Hazer-Rau%2C+Dilana&rft.au=Meudt%2C+Sascha&rft.au=Daucher%2C+Andreas&rft.au=Spohrs%2C+Jennifer&rft.date=2020-04-17&rft.pub=MDPI+AG&rft.eissn=1424-8220&rft.volume=20&rft.issue=8&rft.spage=2308&rft_id=info:doi/10.3390%2Fs20082308&rft.externalDBID=HAS_PDF_LINK