Wearable Supernumerary Robotic Limb System Using a Hybrid Control Approach Based on Motor Imagery and Object Detection

Bibliographic Details
Published in IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 30, pp. 1298–1309
Main Authors Tang, Zhichuan, Zhang, Lingtao, Chen, Xin, Ying, Jichen, Wang, Xinyang, Wang, Hang
Format Journal Article
Language English
Published United States IEEE 2022
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Abstract Motor disorders of the upper limbs seriously affect the daily lives of patients with hemiplegia after stroke. We developed a wearable supernumerary robotic limb (SRL) system that uses a hybrid control approach based on motor imagery (MI) and object detection for upper-limb motion assistance. The SRL system consists of an SRL hardware subsystem and a hybrid control software subsystem. The system obtains the patient's motion intention through an MI electroencephalogram (EEG) recognition method based on a graph convolutional network (GCN) and a gated recurrent unit (GRU) network to control the left and right movements of the SRL; object detection is used alongside MI for quick grasping of target objects, compensating for the drawbacks of using MI EEG alone, such as few control instructions and low control efficiency. An offline training experiment was designed to obtain the subjects' MI recognition models and to evaluate the feasibility of the MI EEG recognition method; an online control experiment was designed to verify the effectiveness of the wearable SRL system. The results showed that the proposed MI EEG recognition method (GCN+GRU) effectively improved MI classification accuracy (90.04% ± 2.36%) compared with traditional methods; all subjects were able to complete the target-object grasping tasks within 23 seconds by controlling the SRL, and the highest average grasping success rate, 90.67%, was achieved in the bag-grasping task. The SRL system can effectively assist people with upper-limb motor disorders in performing upper-limb tasks in daily life through natural human-robot interaction, improving their capacity for self-help and their confidence in daily life.
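The GCN+GRU recognition pipeline described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the electrode adjacency (a ring over 8 channels), the feature sizes, and the random weights are all assumptions, and real use would require trained parameters and preprocessed EEG features.

```python
# Sketch of the GCN+GRU idea: a graph convolution mixes information across
# EEG electrodes using an electrode-adjacency graph, then a GRU models the
# temporal dynamics, and a linear layer gives a left/right MI decision.
# All sizes (8 channels, 16 features, 2 classes) are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_layer(X, A_norm, W):
    """One graph-convolution layer: aggregate neighbor channels, project, ReLU."""
    return np.maximum(A_norm @ X @ W, 0.0)

def gru_step(x, h, P):
    """Standard GRU cell update for one time step."""
    z = sigmoid(P["Wz"] @ x + P["Uz"] @ h)            # update gate
    r = sigmoid(P["Wr"] @ x + P["Ur"] @ h)            # reset gate
    h_tilde = np.tanh(P["Wh"] @ x + P["Uh"] @ (r * h))
    return (1 - z) * h + z * h_tilde

n_ch, n_feat, hidden, n_cls, T = 8, 16, 32, 2, 50

# Hypothetical electrode adjacency (ring topology for illustration).
A = np.zeros((n_ch, n_ch))
for i in range(n_ch):
    A[i, (i + 1) % n_ch] = A[(i + 1) % n_ch, i] = 1.0
A_norm = normalized_adjacency(A)

# Random (untrained) parameters, for shape checking only.
W_gcn = rng.standard_normal((n_feat, n_feat)) * 0.1
P = {k: rng.standard_normal((hidden, n_feat if k[0] == "W" else hidden)) * 0.1
     for k in ["Wz", "Wr", "Wh", "Uz", "Ur", "Uh"]}
W_out = rng.standard_normal((n_cls, hidden)) * 0.1

# Fake EEG trial: T time steps of per-channel feature vectors.
trial = rng.standard_normal((T, n_ch, n_feat))

h = np.zeros(hidden)
for X_t in trial:
    g = gcn_layer(X_t, A_norm, W_gcn)    # spatial mixing across electrodes
    h = gru_step(g.mean(axis=0), h, P)   # temporal dynamics on pooled features

logits = W_out @ h                       # two-way left/right MI decision
print(logits.shape)                      # (2,)
```

With trained weights, `np.argmax(logits)` would select the left or right movement command sent to the SRL; the sketch only demonstrates the data flow.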
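The hybrid control idea (MI EEG supplies coarse left/right commands, while object detection triggers the grasp) can be illustrated with a hypothetical decision loop. The class names, confidence threshold, and centering rule below are assumptions for illustration; the paper's actual control logic may differ.

```python
# Hypothetical fusion of the two control channels from the abstract:
# MI classification gives slow left/right commands; once a target object is
# detected confidently and centered, the grasp takes priority, compensating
# for the low command rate of MI EEG alone.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: str        # e.g. "bag", "bottle" (illustrative class names)
    cx: float         # horizontal center of the bounding box, in [0, 1]
    confidence: float

def srl_command(mi_class: str, det: Optional[Detection],
                conf_thresh: float = 0.6, center_tol: float = 0.1) -> str:
    """Return the next SRL action given the MI output and latest detection."""
    if det is not None and det.confidence >= conf_thresh:
        if abs(det.cx - 0.5) <= center_tol:
            return f"grasp:{det.label}"       # target centered: grasp it
        # Otherwise steer toward the object rather than waiting on MI.
        return "move_left" if det.cx < 0.5 else "move_right"
    # No confident detection: fall back to the two-class MI decision.
    return {"left_mi": "move_left", "right_mi": "move_right"}.get(mi_class, "hold")

print(srl_command("left_mi", None))                          # move_left
print(srl_command("right_mi", Detection("bag", 0.52, 0.9)))  # grasp:bag
```

A real controller would also need debouncing of the MI stream and arm-state feedback, but the priority rule above captures why the hybrid approach needs fewer MI commands than MI-only control.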
Author Tang, Zhichuan
Wang, Hang
Chen, Xin
Ying, Jichen
Zhang, Lingtao
Wang, Xinyang
Author_xml – sequence: 1
  givenname: Zhichuan
  orcidid: 0000-0002-1730-1120
  surname: Tang
  fullname: Tang, Zhichuan
  email: ttzzcc@zju.edu.cn
  organization: Industrial Design Institute, Zhejiang University of Technology, Hangzhou, China
– sequence: 2
  givenname: Lingtao
  surname: Zhang
  fullname: Zhang, Lingtao
  organization: Industrial Design Institute, Zhejiang University of Technology, Hangzhou, China
– sequence: 3
  givenname: Xin
  surname: Chen
  fullname: Chen, Xin
  organization: College of Information Engineering, Zhejiang University of Technology, Hangzhou, China
– sequence: 4
  givenname: Jichen
  surname: Ying
  fullname: Ying, Jichen
  organization: Industrial Design Institute, Zhejiang University of Technology, Hangzhou, China
– sequence: 5
  givenname: Xinyang
  surname: Wang
  fullname: Wang, Xinyang
  organization: Industrial Design Institute, Zhejiang University of Technology, Hangzhou, China
– sequence: 6
  givenname: Hang
  surname: Wang
  fullname: Wang, Hang
  organization: Industrial Design Institute, Zhejiang University of Technology, Hangzhou, China
BackLink https://www.ncbi.nlm.nih.gov/pubmed/35511846 (View this record in MEDLINE/PubMed)
CODEN ITNSB3
CitedBy_id crossref_primary_10_1007_s10846_023_01940_0
crossref_primary_10_1080_10447318_2025_2464915
crossref_primary_10_1109_TNSRE_2025_3551753
crossref_primary_10_3390_bios15020070
crossref_primary_10_1016_j_eswa_2023_121915
crossref_primary_10_1016_j_engappai_2024_109680
crossref_primary_10_1002_aisy_202300448
crossref_primary_10_11834_jig_230031
crossref_primary_10_1109_JBHI_2024_3467090
crossref_primary_10_3390_s24206585
crossref_primary_10_3389_fnbot_2024_1443010
crossref_primary_10_1038_s41598_024_72358_3
crossref_primary_10_1016_j_bspc_2024_106162
crossref_primary_10_1007_s00170_024_14098_2
crossref_primary_10_3390_biomimetics8060479
crossref_primary_10_3390_bioengineering9110682
crossref_primary_10_1016_j_neucom_2023_126901
crossref_primary_10_1016_j_measurement_2023_113447
crossref_primary_10_1007_s11431_024_2806_8
crossref_primary_10_1016_j_bspc_2023_104765
crossref_primary_10_1016_j_aei_2024_102625
crossref_primary_10_3390_biomedicines13030599
crossref_primary_10_1109_TNSRE_2023_3330500
crossref_primary_10_3389_fnhum_2022_1068165
crossref_primary_10_1109_JSEN_2023_3308615
crossref_primary_10_3390_mi14122164
crossref_primary_10_3389_fncom_2022_1010770
crossref_primary_10_3390_bios14050213
crossref_primary_10_1109_OJEMB_2025_3537760
Cites_doi 10.1109/TNNLS.2021.3118468
10.1109/ACCESS.2018.2889093
10.1109/LRA.2021.3058926
10.1109/TNSRE.2006.875642
10.1080/10447318.2016.1267450
10.1109/TITS.2019.2935152
10.1109/TAFFC.2018.2817622
10.1109/TLA.2018.8291481
10.1016/j.robot.2016.10.005
10.1109/TAFFC.2019.2937768
10.1109/CCDC.2018.8408108
10.1109/TNSRE.2019.2950619
10.1109/TNNLS.2018.2876865
10.3389/frobt.2021.661354
10.1109/TNSRE.2020.3038209
10.1109/TNSRE.2020.2984717
10.1088/1741-2552/ab405f
10.1109/LRA.2020.3005629
10.1109/ACCESS.2020.2999133
10.1016/j.compbiomed.2016.08.010
10.1016/j.neunet.2019.07.008
10.1109/LRA.2020.2970948
10.1109/ICMRA51221.2020.9398375
10.1080/10447318.2018.1445068
10.1109/TMRB.2021.3086016
10.1162/neco.1997.9.8.1735
10.1109/TNSRE.2017.2694553
10.1109/TNSRE.2020.3007625
10.1007/s40846-020-00538-3
10.1016/j.ergon.2011.03.005
10.1016/j.ijleo.2016.10.117
10.1145/3349341.3349414
10.1109/SMC.2018.00180
10.1109/CISP.2010.5648081
10.1109/TIM.2020.3047502
10.1016/j.bspc.2020.102172
10.1007/978-3-030-58589-1_21
10.1109/TRO.2016.2520486
10.1016/j.compbiomed.2020.103843
10.1109/TNNLS.2020.3015505
10.1109/TNSRE.2019.2914916
10.1016/S1388-2457(99)00141-8
10.1109/TRO.2015.2506731
10.1007/s11263-020-01401-3
10.3389/fnins.2018.00680
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
DOI 10.1109/TNSRE.2022.3172974
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE Xplore Open Access Journals
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
PubMed
Aluminium Industry Abstracts
Biotechnology Research Abstracts
Ceramic Abstracts
Computer and Information Systems Abstracts
Corrosion Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
Materials Business File
Mechanical & Transportation Engineering Abstracts
Neurosciences Abstracts
Solid State and Superconductivity Abstracts
METADEX
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
Materials Research Database
ProQuest Computer Science Collection
Civil Engineering Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Nursing & Allied Health Premium
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
DOAJ Directory of Open Access Journals
DatabaseTitle CrossRef
PubMed
Materials Research Database
Civil Engineering Abstracts
Aluminium Industry Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Mechanical & Transportation Engineering Abstracts
Electronics & Communications Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Ceramic Abstracts
Neurosciences Abstracts
Materials Business File
METADEX
Biotechnology and BioEngineering Abstracts
Computer and Information Systems Abstracts Professional
Aerospace Database
Nursing & Allied Health Premium
Engineered Materials Abstracts
Biotechnology Research Abstracts
Solid State and Superconductivity Abstracts
Engineering Research Database
Corrosion Abstracts
Advanced Technologies Database with Aerospace
ANTE: Abstracts in New Technology & Engineering
MEDLINE - Academic
DatabaseTitleList MEDLINE - Academic
Materials Research Database
PubMed
Database_xml – sequence: 1
  dbid: DOA
  name: Directory of Open Access Journals (DOAJ)
  url: https://www.doaj.org/
  sourceTypes: Open Website
– sequence: 2
  dbid: NPM
  name: PubMed
  url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 3
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Occupational Therapy & Rehabilitation
EISSN 1558-0210
EndPage 1309
ExternalDocumentID oai_doaj_org_article_c840366efe374972ba22564552836e48
35511846
10_1109_TNSRE_2022_3172974
9770052
Genre orig-research
Journal Article
GrantInformation_xml – fundername: Natural Science Foundation of Zhejiang Province
  grantid: LY20F020028
  funderid: 10.13039/501100004731
– fundername: Philosophy and Social Science Planning Fund Project of Zhejiang Province
  grantid: 22NDJC007Z
– fundername: Key Research and Development Program of Zhejiang Province
  grantid: 2022C03148
– fundername: Fundamental Research Funds for the Provincial Universities of Zhejiang
  grantid: GB201901006
  funderid: 10.13039/501100012226
ISSN 1534-4320
1558-0210
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License https://creativecommons.org/licenses/by/4.0/legalcode
LinkModel DirectLink
ORCID 0000-0002-1730-1120
OpenAccessLink https://doaj.org/article/c840366efe374972ba22564552836e48
PMID 35511846
PQID 2670203340
PQPubID 85423
PageCount 12
ParticipantIDs crossref_citationtrail_10_1109_TNSRE_2022_3172974
doaj_primary_oai_doaj_org_article_c840366efe374972ba22564552836e48
pubmed_primary_35511846
ieee_primary_9770052
crossref_primary_10_1109_TNSRE_2022_3172974
proquest_journals_2670203340
proquest_miscellaneous_2660100325
PublicationCentury 2000
PublicationDate 2022
PublicationDateYYYYMMDD 2022-01-01
PublicationDate_xml – year: 2022
  text: 20220000
PublicationDecade 2020
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: New York
PublicationTitle IEEE transactions on neural systems and rehabilitation engineering
PublicationTitleAbbrev TNSRE
PublicationTitleAlternate IEEE Trans Neural Syst Rehabil Eng
PublicationYear 2022
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref12
hou (ref26) 2020
ref15
ref14
ref52
ref11
ref10
chung (ref31) 2014
ref17
ref16
ref19
ref18
ref51
ref50
redmon (ref42) 2018
ref46
ref45
ref48
ref47
ref43
ref49
ref8
waytowich (ref13) 2010
ref7
ref9
ref4
ref3
ref6
ref5
ref40
ref35
ref34
ref37
ref36
ref30
ref33
ref32
ref2
ref1
ref38
david (ref39) 2020
hou (ref41) 2020
ref24
ref23
ref25
ref20
ref22
ref21
ref28
ref27
ref29
pei (ref44) 2021
References_xml – start-page: 1
  year: 2010
  ident: ref13
  article-title: Robot application of a brain computer interface to staubli tx40 robots-early stages
  publication-title: Proc World Automation Cong
– ident: ref17
  doi: 10.1109/TNNLS.2021.3118468
– ident: ref24
  doi: 10.1109/ACCESS.2018.2889093
– year: 2014
  ident: ref31
  article-title: Empirical evaluation of gated recurrent neural networks on sequence modeling
  publication-title: arXiv:1412.3555
– ident: ref10
  doi: 10.1109/LRA.2021.3058926
– ident: ref46
  doi: 10.1109/TNSRE.2006.875642
– ident: ref34
  doi: 10.1080/10447318.2016.1267450
– year: 2020
  ident: ref39
  article-title: TensorFlow lite micro: Embedded machine learning on TinyML systems
  publication-title: arXiv:2010.08678
– ident: ref32
  doi: 10.1109/TITS.2019.2935152
– ident: ref40
  doi: 10.1109/TAFFC.2018.2817622
– ident: ref15
  doi: 10.1109/TLA.2018.8291481
– ident: ref16
  doi: 10.1016/j.robot.2016.10.005
– ident: ref28
  doi: 10.1109/TAFFC.2019.2937768
– ident: ref22
  doi: 10.1109/CCDC.2018.8408108
– ident: ref11
  doi: 10.1109/TNSRE.2019.2950619
– ident: ref36
  doi: 10.1109/TNNLS.2018.2876865
– ident: ref7
  doi: 10.3389/frobt.2021.661354
– ident: ref38
  doi: 10.1109/TNSRE.2020.3038209
– ident: ref2
  doi: 10.1109/TNSRE.2020.2984717
– ident: ref25
  doi: 10.1088/1741-2552/ab405f
– ident: ref4
  doi: 10.1109/LRA.2020.3005629
– ident: ref12
  doi: 10.1109/ACCESS.2020.2999133
– ident: ref48
  doi: 10.1016/j.compbiomed.2016.08.010
– year: 2018
  ident: ref42
  article-title: YOLOv3: An incremental improvement
  publication-title: arXiv:1804.02767
– ident: ref20
  doi: 10.1016/j.neunet.2019.07.008
– year: 2020
  ident: ref26
  article-title: Deep feature mining via attention-based BiLSTM-GCN for human motor imagery recognition
  publication-title: arXiv:2005.00777
– ident: ref6
  doi: 10.1109/LRA.2020.2970948
– ident: ref9
  doi: 10.1109/ICMRA51221.2020.9398375
– ident: ref47
  doi: 10.1080/10447318.2018.1445068
– ident: ref5
  doi: 10.1109/TMRB.2021.3086016
– ident: ref30
  doi: 10.1162/neco.1997.9.8.1735
– ident: ref8
  doi: 10.1109/TNSRE.2017.2694553
– ident: ref37
  doi: 10.1109/TNSRE.2020.3007625
– ident: ref23
  doi: 10.1007/s40846-020-00538-3
– ident: ref49
  doi: 10.1016/j.ergon.2011.03.005
– ident: ref18
  doi: 10.1016/j.ijleo.2016.10.117
– ident: ref50
  doi: 10.1145/3349341.3349414
– ident: ref35
  doi: 10.1109/SMC.2018.00180
– ident: ref45
  doi: 10.1109/CISP.2010.5648081
– ident: ref27
  doi: 10.1109/TIM.2020.3047502
– ident: ref43
  doi: 10.1016/j.bspc.2020.102172
– ident: ref51
  doi: 10.1007/978-3-030-58589-1_21
– ident: ref3
  doi: 10.1109/TRO.2016.2520486
– ident: ref14
  doi: 10.1016/j.compbiomed.2020.103843
– year: 2020
  ident: ref41
  article-title: GCNs-Net: A graph convolutional neural network approach for decoding time-resolved EEG motor imagery signals
  publication-title: arXiv:2006.08924
– ident: ref19
  doi: 10.1109/TNNLS.2020.3015505
– ident: ref33
  doi: 10.1109/TNSRE.2019.2914916
– ident: ref29
  doi: 10.1016/S1388-2457(99)00141-8
– ident: ref1
  doi: 10.1109/TRO.2015.2506731
– start-page: 511
  year: 2021
  ident: ref44
  article-title: A tensor-based frequency features combination method for brain-computer interfaces
  publication-title: Proc Int Conf Cognit Syst Signal Process
– ident: ref52
  doi: 10.1007/s11263-020-01401-3
– ident: ref21
  doi: 10.3389/fnins.2018.00680
SourceID doaj
proquest
pubmed
crossref
ieee
SourceType Open Website
Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 1298
SubjectTerms Control systems
EEG
Electroencephalography
Feature extraction
Grasping
Hardware
Hemiplegia
Human engineering
human-robot interaction
Hybrid control
Hybrid systems
Manipulators
Mental task performance
motion imagery
Motion perception
Motor task performance
Object detection
Object recognition
Robots
Subsystems
Supernumerary
Supernumerary robotic limb
Task analysis
upper limb assistance
Wearable technology
Title Wearable Supernumerary Robotic Limb System Using a Hybrid Control Approach Based on Motor Imagery and Object Detection
URI https://ieeexplore.ieee.org/document/9770052
https://www.ncbi.nlm.nih.gov/pubmed/35511846
https://www.proquest.com/docview/2670203340
https://www.proquest.com/docview/2660100325
https://doaj.org/article/c840366efe374972ba22564552836e48
Volume 30