Parallel Spatial–Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI
Motor imagery (MI) electroencephalography (EEG) classification is an important part of the brain-computer interface (BCI), allowing people with mobility problems to communicate with the outside world via assistive devices. However, EEG decoding is a challenging task because of its complexity, dynamic nature, and low signal-to-noise ratio. …
Published in | Frontiers in Neuroscience, Vol. 14, p. 587520
Main Authors | Xiuling Liu; Yonglong Shen; Jing Liu; Jianli Yang; Peng Xiong; Feng Lin
Format | Journal Article
Language | English
Published | Lausanne, Switzerland: Frontiers Media S.A. (Frontiers Research Foundation), 11 December 2020
Abstract | Motor imagery (MI) electroencephalography (EEG) classification is an important part of the brain-computer interface (BCI), allowing people with mobility problems to communicate with the outside world via assistive devices. However, EEG decoding is a challenging task because of its complexity, dynamic nature, and low signal-to-noise ratio. Designing an end-to-end framework that fully extracts the high-level features of EEG signals remains a challenge. In this study, we present a parallel spatial–temporal self-attention-based convolutional neural network for four-class MI EEG signal classification. This study is the first to define a new spatial–temporal representation of raw EEG signals that uses the self-attention mechanism to extract distinguishable spatial–temporal features. Specifically, we use the spatial self-attention module to capture the spatial dependencies between the channels of MI EEG signals. This module updates each channel by aggregating features over all channels with a weighted summation, thus improving the classification accuracy and eliminating the artifacts caused by manual channel selection. Furthermore, the temporal self-attention module encodes the global temporal information into features for each sampling time step, so that the high-level temporal features of the MI EEG signals can be extracted in the time domain. Quantitative analysis shows that our method outperforms state-of-the-art methods for intra-subject and inter-subject classification, demonstrating its robustness and effectiveness. In terms of qualitative analysis, we perform a visual inspection of the new spatial–temporal representation estimated from the learned architecture. Finally, the proposed method is employed to realize control of drones based on EEG signals, verifying its feasibility in real-time applications.
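The record carries no code, but the two modules the abstract names (a spatial self-attention module that updates each channel by a weighted summation over all channels, and a temporal self-attention module that encodes global temporal information into each time step, run in parallel ahead of a CNN) are concrete enough to sketch. Below is a minimal, hypothetical PyTorch illustration; all layer sizes, the module names (`AxisSelfAttention`, `ParallelSTAttentionNet`), the 22-channel/250-sample input shape, and the convolutional head are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of parallel spatial-temporal self-attention for
# raw MI EEG of shape (batch, channels, time). Not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AxisSelfAttention(nn.Module):
    """Scaled dot-product self-attention over one axis of an EEG tensor.

    Each position (an electrode channel for the spatial module, a sampling
    time step for the temporal module) is updated as a weighted summation
    over all positions along that axis, as the abstract describes.
    """

    def __init__(self, feat_dim: int):
        super().__init__()
        self.query = nn.Linear(feat_dim, feat_dim)
        self.key = nn.Linear(feat_dim, feat_dim)
        self.value = nn.Linear(feat_dim, feat_dim)
        self.scale = feat_dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, positions, feat_dim)
        q, k, v = self.query(x), self.key(x), self.value(x)
        attn = F.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return attn @ v  # each position aggregates features over all positions


class ParallelSTAttentionNet(nn.Module):
    """Parallel spatial and temporal attention branches feeding a small
    CNN classifier for four-class MI; sizes are illustrative only."""

    def __init__(self, n_channels: int = 22, n_samples: int = 250,
                 n_classes: int = 4):
        super().__init__()
        # Spatial branch: attend over channels; features are time samples.
        self.spatial_attn = AxisSelfAttention(n_samples)
        # Temporal branch: attend over time steps; features are channels.
        self.temporal_attn = AxisSelfAttention(n_channels)
        self.conv = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AdaptiveAvgPool2d((n_channels, 16)),
        )
        self.fc = nn.Linear(16 * n_channels * 16, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, n_samples) raw EEG
        spatial = self.spatial_attn(x)                    # over channels
        temporal = self.temporal_attn(x.transpose(1, 2))  # over time steps
        temporal = temporal.transpose(1, 2)               # back to (B, C, T)
        # Stack the two representations as a 2-plane input to the CNN.
        feats = torch.stack([spatial, temporal], dim=1)   # (B, 2, C, T)
        return self.fc(self.conv(feats).flatten(1))


if __name__ == "__main__":
    eeg = torch.randn(8, 22, 250)          # batch of 8 trials
    logits = ParallelSTAttentionNet()(eeg)
    print(logits.shape)                    # torch.Size([8, 4])
```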
Author | Xiong, Peng; Liu, Jing; Liu, Xiuling; Yang, Jianli; Lin, Feng; Shen, Yonglong
AuthorAffiliation | 1 College of Electronic Information Engineering, Hebei University, Baoding, China; 2 Key Laboratory of Digital Medical Engineering of Hebei Province, Hebei University, Baoding, China; 3 College of Computer and Cyber Security, Hebei Normal University, Shijiazhuang, China; 4 Beijing Key Laboratory of Mobile Computing and Pervasive Device, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China; 5 School of Computer Science and Engineering, Nanyang Technological University, Singapore, Singapore
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/33362458 (PubMed)
ContentType | Journal Article |
Copyright | Copyright © 2020 Liu, Shen, Liu, Yang, Xiong and Lin. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
DOI | 10.3389/fnins.2020.587520 |
Discipline | Anatomy & Physiology |
EISSN | 1662-453X |
Genre | Journal Article |
GrantInformation | National Natural Science Foundation of China (61802109, 61673158); Natural Science Foundation of Hebei Province (F2020205006)
ISSN | 1662-4548
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Keywords | spatial-temporal self-attention; deep learning; BCI; motor imagery; EEG
Language | English |
License | Copyright © 2020 Liu, Shen, Liu, Yang, Xiong and Lin. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
Notes | This article was submitted to Brain Imaging Methods, a section of the journal Frontiers in Neuroscience. Edited by: Saugat Bhattacharyya, Ulster University, United Kingdom. Reviewed by: Davide Valeriani, Massachusetts Eye & Ear Infirmary, Harvard Medical School, United States; Jacobo Fernandez-Vargas, University of Essex, United Kingdom
OpenAccessLink | https://www.proquest.com/docview/2469424414
PMID | 33362458 |
PQID | 2469424414 |
PQPubID | 4424402 |
PublicationCentury | 2000 |
PublicationDate | 2020-12-11 |
PublicationDecade | 2020 |
PublicationPlace | Lausanne, Switzerland
PublicationTitle | Frontiers in neuroscience |
PublicationTitleAlternate | Front Neurosci |
PublicationYear | 2020 |
Publisher | Frontiers Research Foundation Frontiers Media S.A |
SourceType | Open Website; Open Access Repository; Aggregation Database; Index Database; Enrichment Source
StartPage | 587520 |
SubjectTerms | Accuracy; BCI; Classification; Deep learning; EEG; Electroencephalography; Feature selection; Mental task performance; Methods; Motor imagery; Neural networks; Neuroscience; Spatial-temporal self-attention; Temporal variations
Title | Parallel Spatial–Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI |
URI | https://www.ncbi.nlm.nih.gov/pubmed/33362458 https://www.proquest.com/docview/2469424414 https://www.proquest.com/docview/2473412105 https://pubmed.ncbi.nlm.nih.gov/PMC7759669 https://doaj.org/article/3c23f50927924091a3dd73ab8769cd97 |
Volume | 14 |