A Large Finer-grained Affective Computing EEG Dataset
Affective computing based on electroencephalogram (EEG) has gained increasing attention for its objectivity in measuring emotional states. While positive emotions play a crucial role in various real-world applications, such as human-computer interactions, the state-of-the-art EEG datasets have primarily focused on negative emotions, with less consideration given to positive emotions.
Published in | Scientific Data, Vol. 10, No. 1, Article 740 (10 pages) |
Main Authors | Chen, Jingjing; Wang, Xiaobin; Huang, Chen; Hu, Xin; Shen, Xinke; Zhang, Dan |
Format | Journal Article |
Language | English |
Published | London: Nature Publishing Group UK, 25.10.2023 |
Subjects | Datasets; EEG; Electroencephalography; Emotions; Science (multidisciplinary) |
Abstract | Affective computing based on electroencephalogram (EEG) has gained increasing attention for its objectivity in measuring emotional states. While positive emotions play a crucial role in various real-world applications, such as human-computer interactions, the state-of-the-art EEG datasets have primarily focused on negative emotions, with less consideration given to positive emotions. Meanwhile, these datasets usually have a relatively small sample size, limiting exploration of the important issue of cross-subject affective computing. The proposed Finer-grained Affective Computing EEG Dataset (FACED) aimed to address these issues by recording 32-channel EEG signals from 123 subjects. During the experiment, subjects watched 28 emotion-elicitation video clips covering nine emotion categories (amusement, inspiration, joy, tenderness; anger, fear, disgust, sadness, and neutral emotion), providing a fine-grained and balanced categorization on both the positive and negative sides of emotion. The validation results show that emotion categories can be effectively recognized based on EEG signals at both the intra-subject and the cross-subject levels. The FACED dataset is expected to contribute to developing EEG-based affective computing algorithms for real-world applications. |
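The abstract reports that emotion categories can be decoded at both the intra-subject and the cross-subject level. As a rough illustration of what the cross-subject setting involves, the sketch below evaluates a classifier with scikit-learn's GroupKFold so that no subject contributes trials to both the training and the test fold. Only the dimensions (123 subjects, 28 clips, 32 channels) come from the record; the synthetic data, trial length, differential-entropy feature, and classifier are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Dimensions reported for FACED: 123 subjects, 28 clips, 32 EEG channels.
# Trial length (and an implied sampling rate) is a placeholder, not from the paper.
n_subjects, n_clips, n_channels, n_samples = 123, 28, 32, 7500

rng = np.random.default_rng(0)
# Stand-in for real recordings: one trial per video clip per subject.
X_raw = rng.standard_normal((n_subjects * n_clips, n_channels, n_samples))
# Placeholder labels; in FACED each clip maps to one of nine emotion categories.
y = rng.integers(0, 9, size=n_subjects * n_clips)
subjects = np.repeat(np.arange(n_subjects), n_clips)  # subject ID of each trial

# Per-channel differential entropy under a Gaussian assumption,
# 0.5 * ln(2*pi*e*var): a common EEG emotion feature in the literature
# (the paper's exact feature set may differ).
feats = 0.5 * np.log(2 * np.pi * np.e * X_raw.var(axis=-1))

# GroupKFold keeps every trial of a subject inside a single fold, so test
# subjects are never seen during training -- the cross-subject setting.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, feats, y, groups=subjects, cv=GroupKFold(n_splits=5))
print(f"cross-subject accuracy: {scores.mean():.3f} (chance ~ {1/9:.3f})")
```

Replacing GroupKFold with a plain KFold would mix trials of the same subject across training and test folds, leaking subject-specific signal and inflating accuracy relative to the genuine cross-subject setting, which is exactly the distinction the dataset is meant to probe.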
ArticleNumber | 740 |
Author | Chen, Jingjing; Wang, Xiaobin; Huang, Chen; Hu, Xin; Shen, Xinke; Zhang, Dan |
Author_xml | – sequence: 1; fullname: Chen, Jingjing; organization: Dept. of Psychology, School of Social Sciences, Tsinghua University; Tsinghua Laboratory of Brain and Intelligence, Tsinghua University
– sequence: 2; fullname: Wang, Xiaobin; organization: Dept. of Psychology, School of Social Sciences, Tsinghua University; Tsinghua Laboratory of Brain and Intelligence, Tsinghua University
– sequence: 3; fullname: Huang, Chen; organization: Dept. of Psychology, School of Social Sciences, Tsinghua University; Tsinghua Laboratory of Brain and Intelligence, Tsinghua University
– sequence: 4; fullname: Hu, Xin; organization: Dept. of Psychology, School of Social Sciences, Tsinghua University; Dept. of Psychiatry, School of Medicine, University of Pittsburgh
– sequence: 5; fullname: Shen, Xinke; organization: Tsinghua Laboratory of Brain and Intelligence, Tsinghua University; Dept. of Biomedical Engineering, School of Medicine, Tsinghua University
– sequence: 6; fullname: Zhang, Dan; orcidid: 0000-0002-7592-3200; email: dzhang@tsinghua.edu.cn; organization: Dept. of Psychology, School of Social Sciences, Tsinghua University; Tsinghua Laboratory of Brain and Intelligence, Tsinghua University |
CitedBy_id | 10.1016/j.procs.2024.08.032; 10.1016/j.neuroimage.2024.120890; 10.1109/TIM.2024.3398103; 10.1016/j.bspc.2024.106249; 10.1007/s11571-024-10186-x; 10.1016/j.bspc.2025.107536; 10.1109/JBHI.2024.3395622; 10.1016/j.bspc.2025.107511; 10.1038/s41597-024-04102-5; 10.1109/JBHI.2024.3384816; 10.1016/j.knosys.2025.113018; 10.1080/10255842.2024.2417212; 10.1109/TIM.2024.3472838; 10.1109/TAFFC.2024.3433470 |
ContentType | Journal Article |
Copyright | The Author(s) 2023 The Author(s) 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. 2023. Springer Nature Limited. Springer Nature Limited 2023 |
DOI | 10.1038/s41597-023-02650-w |
Discipline | Sciences (General) |
EISSN | 2052-4463 |
EndPage | 10 |
ExternalDocumentID | DOAJ: 78f3ee5660e14b988ffacc095eabd93d; PMC: PMC10600242; DOI: 10.1038/s41597-023-02650-w |
GrantInformation | Tsinghua University Spring Breeze Fund (2021Z99CFY037); National Natural Science Foundation of China (61977041, 62107025; https://doi.org/10.13039/501100001809) |
ISSN | 2052-4463 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 1 |
Language | English |
License | Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. |
ORCID | 0000-0002-7592-3200 |
OpenAccessLink | http://journals.scholarsportal.info/openUrl.xqy?doi=10.1038/s41597-023-02650-w |
PMID | 37880266 |
PageCount | 10 |
PublicationDate | 2023-10-25 |
PublicationPlace | London |
PublicationTitle | Scientific data |
PublicationTitleAbbrev | Sci Data |
PublicationYear | 2023 |
Publisher | Nature Publishing Group UK Nature Publishing Group Nature Portfolio |
StartPage | 740 |
SubjectTerms | 631/378/1457; 631/477/2811; Data Descriptor; Datasets; EEG; Electroencephalography; Emotions; Humanities and Social Sciences, multidisciplinary; Science; Science (multidisciplinary) |
Title | A Large Finer-grained Affective Computing EEG Dataset |
URI | https://link.springer.com/article/10.1038/s41597-023-02650-w https://www.proquest.com/docview/2881549572 https://www.proquest.com/docview/2882323021 https://pubmed.ncbi.nlm.nih.gov/PMC10600242 https://doaj.org/article/78f3ee5660e14b988ffacc095eabd93d |
Volume | 10 |