Multisource Transfer Learning for Cross-Subject EEG Emotion Recognition
Published in | IEEE Transactions on Cybernetics, Vol. 50, No. 7, pp. 3281–3293
Main Authors | Li, Jinpeng; Qiu, Shuang; Shen, Yuan-Yuan; Liu, Cheng-Lin; He, Huiguang
Format | Journal Article
Language | English
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.07.2020
Online Access | Get full text
Abstract | Electroencephalogram (EEG) has been widely used in emotion recognition because of its high temporal resolution and reliability. Because individual differences in EEG are large, emotion recognition models cannot be shared across persons, and new labeled data must be collected to train a personal model for each new user. In some applications, we want to obtain a model for a new person as quickly as possible and to reduce the amount of labeled data required. To achieve this, we propose a multisource transfer learning method in which existing persons are the sources and the new person is the target. The target data are divided into calibration sessions for training and subsequent sessions for testing. The first stage of the method is source selection, which locates appropriate sources; the second is style transfer mapping, which reduces the EEG differences between the target and each selected source. Only a small amount of labeled data from the calibration sessions is needed for source selection and style transfer. Finally, the source models are integrated to recognize emotions in the subsequent sessions. Experimental results show that three-category classification accuracy on the SEED benchmark improves by 12.72% compared with the nontransfer method. By reducing the reliance on labeled data, the method enables fast deployment of emotion recognition models for new users.
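The abstract walks through a concrete three-stage pipeline: select helpful source subjects using the target's calibration data, learn a style transfer mapping that pulls the target's features toward each selected source, then integrate the source models' predictions on the subsequent sessions. The Python sketch below illustrates one plausible reading of those stages. It is a minimal illustration under assumptions not stated in this record: precomputed EEG feature vectors, linear SVMs as the per-source models, the standard closed form of style transfer mapping, and made-up hyperparameters (beta, gamma, n_select). It is not the authors' implementation.

```python
# Hedged sketch of the three-stage pipeline from the abstract; names and
# hyperparameters are illustrative assumptions, not the paper's values.
import numpy as np
from sklearn.svm import SVC


def fit_stm(S, T, beta=1.0, gamma=1.0):
    """Style transfer mapping: closed-form affine map x -> A @ x + b that
    moves each point s_i toward its destination t_i by solving
        min_{A,b} sum_i ||A s_i + b - t_i||^2 + beta ||A - I||_F^2 + gamma ||b||^2.
    S, T: (n, d) arrays of paired points."""
    n, d = S.shape
    denom = n + gamma
    s_sum, t_sum = S.sum(axis=0), T.sum(axis=0)
    P = S.T @ S - np.outer(s_sum, s_sum) / denom + beta * np.eye(d)
    Q = T.T @ S - np.outer(t_sum, s_sum) / denom + beta * np.eye(d)
    A = np.linalg.solve(P, Q.T).T          # A = Q @ inv(P); P is symmetric
    b = (t_sum - A @ s_sum) / denom
    return A, b


def multisource_transfer(sources, X_cal, y_cal, X_test, n_select=5):
    """sources: list of (X_s, y_s) labeled sets, one per existing person.
    X_cal, y_cal: the target's few labeled calibration trials.
    Returns predicted labels for the target's subsequent sessions."""
    # Per-source models, trained once offline.
    models = [SVC(kernel="linear", probability=True).fit(X_s, y_s)
              for X_s, y_s in sources]
    # Stage 1 -- source selection: keep the sources whose models already
    # classify the target's calibration data best.
    scores = [m.score(X_cal, y_cal) for m in models]
    chosen = np.argsort(scores)[-n_select:]
    # Stage 2 -- style transfer mapping: for each selected source, map the
    # target's features toward that source's class means, so the source
    # model sees target data "in its own style".
    probs = []
    for k in chosen:
        X_s, y_s = sources[k]
        class_means = {c: X_s[y_s == c].mean(axis=0) for c in np.unique(y_s)}
        T = np.stack([class_means[c] for c in y_cal])
        A, b = fit_stm(X_cal, T)
        probs.append(models[k].predict_proba(X_test @ A.T + b))
    # Stage 3 -- integration: average the selected models' posteriors.
    # (Assumes every source contains all emotion classes, so the
    # predict_proba columns align across models.)
    classes = models[chosen[0]].classes_
    return classes[np.mean(probs, axis=0).argmax(axis=1)]
```

Averaging class posteriors in the last stage is one simple integration rule, and selection by calibration accuracy likewise stands in for whatever criterion the paper actually uses.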
Authors |
– 1. Jinpeng Li (lijinpeng2015@ia.ac.cn), Research Center for Brain-Inspired Intelligence, National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China
– 2. Shuang Qiu (shuang.qiu@ia.ac.cn), Research Center for Brain-Inspired Intelligence, National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China
– 3. Yuan-Yuan Shen (shenyuanyuan2015@ia.ac.cn), School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
– 4. Cheng-Lin Liu (liucl@nlpr.ia.ac.cn), School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
– 5. Huiguang He (huiguang.he@ia.ac.cn), Research Center for Brain-Inspired Intelligence, National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China
CODEN | ITCEB8 |
ContentType | Journal Article |
Copyright | The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020
DOI | 10.1109/TCYB.2019.2904052 |
EISSN | 2168-2275 |
EndPage | 3293 |
ExternalDocumentID | PMID 30932860; DOI 10.1109/TCYB.2019.2904052; IEEE Xplore 8675478
Genre | Original research; Journal Article
GrantInformation |
– Beijing Municipal Science and Technology Commission (Z181100008918010)
– Strategic Priority Research Program of CAS
– National Natural Science Foundation of China (91520202; 81701785)
– Youth Innovation Promotion Association CAS
– Chinese Academy of Sciences (CAS) Scientific Equipment Development Project (YJKYYQ20170050)
ISSN | 2168-2267 (print); 2168-2275 (electronic)
IsPeerReviewed | true |
IsScholarly | true |
Issue | 7 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | 0000-0002-6743-4175 (Cheng-Lin Liu); 0000-0002-0684-1711 (Huiguang He)
PMID | 30932860 |
PageCount | 13 |
PublicationDate | 2020-07-01 |
PublicationPlace | United States |
PublicationTitle | IEEE transactions on cybernetics |
PublicationTitleAbbrev | TCYB |
PublicationTitleAlternate | IEEE Trans Cybern |
PublicationYear | 2020 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 3281 |
SubjectTerms | Adult; Brain - physiology; Brain modeling; Brain-Computer Interfaces; Brain–computer interface; Calibration; Data models; Electroencephalography; Electroencephalography - classification; Electroencephalography - methods; Emotion recognition; Emotions; Emotions - classification; Humans; Learning; Machine Learning; Mapping; Pattern Recognition, Automated - methods; Temporal resolution; Training; Training data; transfer learning (TL); Young Adult
Title | Multisource Transfer Learning for Cross-Subject EEG Emotion Recognition |
URI | https://ieeexplore.ieee.org/document/8675478 https://www.ncbi.nlm.nih.gov/pubmed/30932860 https://www.proquest.com/docview/2414534564 https://www.proquest.com/docview/2201716600 |
Volume | 50 |