Brain–machine interface based on deep learning to control asynchronously a lower-limb robotic exoskeleton: a case-of-study

Bibliographic Details
Published in Journal of Neuroengineering and Rehabilitation, Vol. 21, no. 1, art. no. 48 (14 pages)
Main Authors Ferrero, Laura, Soriano-Segura, Paula, Navarro, Jacobo, Jones, Oscar, Ortiz, Mario, Iáñez, Eduardo, Azorín, José M., Contreras-Vidal, José L.
Format Journal Article
Language English
Published England: BioMed Central Ltd, 05.04.2024
Abstract
Background: This research focused on the development of a motor imagery (MI) based brain–machine interface (BMI) using deep learning algorithms to control a lower-limb robotic exoskeleton. The study aimed to overcome the limitations of traditional BMI approaches by leveraging the advantages of deep learning, such as automated feature extraction and transfer learning. The experimental protocol to evaluate the BMI was designed as asynchronous, allowing subjects to perform mental tasks at their own will.
Methods: A total of five healthy able-bodied subjects were enrolled in this study to participate in a series of experimental sessions. The brain signals from two of these sessions were used to develop a generic deep learning model through transfer learning. Subsequently, this model was fine-tuned during the remaining sessions and subjected to evaluation. Three distinct deep learning approaches were compared: one that did not undergo fine-tuning, another that fine-tuned all layers of the model, and a third that fine-tuned only the last three layers. The evaluation phase involved exclusive closed-loop control of the exoskeleton device by the participants' neural activity, using the second deep learning approach for the decoding.
Results: The three deep learning approaches were assessed against an approach based on spatial features that was trained for each subject and experimental session, and demonstrated superior performance. Interestingly, the deep learning approach without fine-tuning achieved performance comparable to the features-based approach, indicating that a generic model trained on data from different individuals and previous sessions can yield similar efficacy. Among the three deep learning approaches compared, fine-tuning all layer weights demonstrated the highest performance.
Conclusion: This research represents an initial stride toward future calibration-free methods. Despite the efforts to reduce calibration time by leveraging data from other subjects, complete elimination proved unattainable. The study's findings hold notable significance for advancing calibration-free approaches, offering the promise of minimizing the need for training trials. Furthermore, the experimental evaluation protocol employed in this study aimed to replicate real-life scenarios, granting participants a higher degree of autonomy in decision-making regarding actions such as walking or stopping gait.
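The Methods compare three transfer-learning variants of one generic model: no fine-tuning, fine-tuning every layer, and fine-tuning only the last three layers. A minimal sketch of that comparison follows; the layer names and the `trainable_layers` helper are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of the three fine-tuning strategies compared in the study.
# Layer names are invented for illustration; a real EEG decoder (e.g. a compact
# ConvNet) would expose its own layer/parameter names.

def trainable_layers(layer_names, strategy):
    """Return the layers whose weights are updated under each strategy."""
    if strategy == "no_finetune":   # generic inter-subject model used as-is
        return []
    if strategy == "all_layers":    # adapt the whole network to the new session
        return list(layer_names)
    if strategy == "last_three":    # adapt only the final three layers
        return list(layer_names)[-3:]
    raise ValueError(f"unknown strategy: {strategy}")

layers = ["conv_temporal", "conv_spatial", "pool", "fc1", "fc2", "softmax"]
print(trainable_layers(layers, "last_three"))  # → ['fc1', 'fc2', 'softmax']
```

In a deep learning framework this selection would typically be realized by freezing the excluded parameters (e.g. disabling gradient updates on them) before resuming training on the new subject's session data.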
ArticleNumber 48
Audience Academic
Author_xml – sequence: 1
  givenname: Laura
  orcidid: 0000-0003-2256-757X
  surname: Ferrero
  fullname: Ferrero, Laura
– sequence: 2
  givenname: Paula
  surname: Soriano-Segura
  fullname: Soriano-Segura, Paula
– sequence: 3
  givenname: Jacobo
  surname: Navarro
  fullname: Navarro, Jacobo
– sequence: 4
  givenname: Oscar
  surname: Jones
  fullname: Jones, Oscar
– sequence: 5
  givenname: Mario
  orcidid: 0000-0002-4269-1554
  surname: Ortiz
  fullname: Ortiz, Mario
– sequence: 6
  givenname: Eduardo
  orcidid: 0000-0001-8057-5952
  surname: Iáñez
  fullname: Iáñez, Eduardo
– sequence: 7
  givenname: José M.
  orcidid: 0000-0001-5548-9657
  surname: Azorín
  fullname: Azorín, José M.
– sequence: 8
  givenname: José L.
  orcidid: 0000-0002-6499-1208
  surname: Contreras-Vidal
  fullname: Contreras-Vidal, José L.
BackLink https://www.ncbi.nlm.nih.gov/pubmed/38581031 (View this record in MEDLINE/PubMed)
CitedBy_id 10.3390/signals5030034
10.1115/1.4066859
10.1109/TBME.2024.3440036
10.1109/TIM.2024.3481538
ContentType Journal Article
Copyright 2024. The Author(s).
COPYRIGHT 2024 BioMed Central Ltd.
2024. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Copyright_xml – notice: 2024. The Author(s).
– notice: COPYRIGHT 2024 BioMed Central Ltd.
– notice: 2024. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
DOI 10.1186/s12984-024-01342-9
Discipline Medicine
Engineering
Occupational Therapy & Rehabilitation
Physical Therapy
EISSN 1743-0003
EndPage 14
ExternalDocumentID oai_doaj_org_article_ef97d457680b494fad9c7e0aeea56663
A789093501
38581031
10_1186_s12984_024_01342_9
Genre Journal Article
GeographicLocations Spain
Germany
GrantInformation_xml – fundername: Ministry of Science, Innovation and Universities through the Aid for the Training of University Teachers
  grantid: FPU19/03165
– fundername: MCIN/AEI/10.13039/501100011033 and by ERDF A way of making Europe
  grantid: PID2021-124111OB-C31
ISSN 1743-0003
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 1
Keywords Deep learning
Exoskeleton
Brain–machine interface
Transfer learning
EEG
Language English
License 2024. The Author(s).
OpenAccessLink https://www.proquest.com/docview/3037873896
PMID 38581031
PageCount 14
PublicationCentury 2000
PublicationDate 2024-04-05
PublicationDecade 2020
PublicationPlace England
PublicationTitle Journal of neuroengineering and rehabilitation
PublicationTitleAlternate J Neuroeng Rehabil
PublicationYear 2024
Publisher BioMed Central Ltd
BioMed Central
BMC
– volume: 12
  start-page: 312
  issue: August
  year: 2018
  ident: 1342_CR12
  publication-title: Front Hum Neurosci
  doi: 10.3389/fnhum.2018.00312
– ident: 1342_CR16
  doi: 10.1007/978-3-642-15995-4_78
– volume: 15
  start-page: 1
  issue: November
  year: 2021
  ident: 1342_CR20
  publication-title: Front Neurosci
  doi: 10.3389/fnins.2021.774857
– volume: 20
  start-page: 7309
  issue: 24
  year: 2020
  ident: 1342_CR27
  publication-title: Sensors
  doi: 10.3390/s20247309
– volume: 51
  issue: December 2020
  year: 2021
  ident: 1342_CR29
  publication-title: Musculoskelet Sci Pract
  doi: 10.1016/j.msksp.2020.102313
– volume: 28
  year: 2020
  ident: 1342_CR9
  publication-title: NeuroImage Clin
  doi: 10.1016/j.nicl.2020.102502
– volume: 36
  start-page: 747
  issue: 12
  year: 2022
  ident: 1342_CR5
  publication-title: Neurorehabil Neural Repair
  doi: 10.1177/15459683221138751
– ident: 1342_CR35
  doi: 10.1109/EMBC40787.2023.10340008
– volume: 12
  start-page: 78
  issue: 1
  year: 2018
  ident: 1342_CR44
  publication-title: Front Neuroinform
  doi: 10.3389/fninf.2018.00078
– volume: 21
  start-page: 6431
  issue: 19
  year: 2021
  ident: 1342_CR42
  publication-title: Sensors
  doi: 10.3390/s21196431
– volume: 15
  start-page: 569
  issue: 4
  year: 2021
  ident: 1342_CR4
  publication-title: Cogn Neurodyn
  doi: 10.1007/s11571-021-09676-z
– volume: 38
  start-page: 5391
  issue: 11
  year: 2017
  ident: 1342_CR18
  publication-title: Hum Brain Mapp
  doi: 10.1002/hbm.23730
– volume: 15
  issue: 3
  year: 2018
  ident: 1342_CR26
  publication-title: J Neural Eng
  doi: 10.1088/1741-2552/aab2f2
– volume: 10
  start-page: 456
  year: 2016
  ident: 1342_CR1
  publication-title: Front Neurosci
  doi: 10.3389/fnins.2016.00456
– volume: 8
  start-page: 84070
  year: 2020
  ident: 1342_CR43
  publication-title: IEEE Access
  doi: 10.1109/ACCESS.2020.2991812
– volume: 7
  start-page: 7808
  issue: 1
  year: 2017
  ident: 1342_CR3
  publication-title: Sci Rep
  doi: 10.1038/s41598-017-07823-3
– volume: 143
  issue: August 2021
  year: 2022
  ident: 1342_CR24
  publication-title: Comput Biol Med
  doi: 10.1016/j.compbiomed.2022.105242
– volume: 26
  issue: 5
  year: 2023
  ident: 1342_CR37
  publication-title: iScience
  doi: 10.1016/j.isci.2023.106675
– volume: 101
  start-page: 1679
  issue: 3
  year: 2009
  ident: 1342_CR34
  publication-title: J Neurophysiol
  doi: 10.1152/jn.90989.2008
– ident: 1342_CR41
  doi: 10.1109/EMBC44109.2020.9175929
– ident: 1342_CR38
  doi: 10.1109/EMBC48229.2022.9871590
– volume: 383
  issue: October 2022
  year: 2023
  ident: 1342_CR2
  publication-title: J Neurosci Methods
  doi: 10.1016/j.jneumeth.2022.109736
– volume: 18
  issue: 2
  year: 2021
  ident: 1342_CR21
  publication-title: J Neural Eng
  doi: 10.1088/1741-2552/abda0c
– volume: 31
  start-page: 153
  year: 2006
  ident: 1342_CR11
  publication-title: Neuroimage
  doi: 10.1016/j.neuroimage.2005.12.003
– volume: 13
  start-page: 1
  issue: 9
  year: 2022
  ident: 1342_CR23
  publication-title: Micromachines
  doi: 10.3390/mi13091485
– volume: 34
  start-page: 621
  issue: 5
  year: 2012
  ident: 1342_CR30
  publication-title: J Sport Exerc Psychol
  doi: 10.1123/jsep.34.5.621
– year: 2021
  ident: 1342_CR36
  publication-title: Appl Sci
  doi: 10.3390/app11094106
– ident: 1342_CR13
  doi: 10.1109/EMBC.2018.8512256
– ident: 1342_CR19
  doi: 10.1109/EMBC46164.2021.9630155
– volume: 17
  start-page: 0268880
  year: 2022
  ident: 1342_CR39
  publication-title: PLoS ONE
  doi: 10.1371/journal.pone.0268880
– volume: 15
  issue: 2
  year: 2018
  ident: 1342_CR6
  publication-title: J Neural Eng
  doi: 10.1088/1741-2552/aaa8c0
– volume: 8
  start-page: 735
  year: 2020
  ident: 1342_CR14
  publication-title: Front Bioeng Biotechnol
  doi: 10.3389/fbioe.2020.00735
– volume: 16
  year: 2019
  ident: 1342_CR17
  publication-title: J Neural Eng
  doi: 10.1088/1741-2552/ab0ab5
– ident: 1342_CR25
  doi: 10.1109/EMBC.2019.8857575
– volume: 9
  start-page: 1
  issue: 1
  year: 2019
  ident: 1342_CR28
  publication-title: Sci Rep
  doi: 10.1038/s41598-019-46310-9
– ident: 1342_CR40
  doi: 10.1109/EMBC40787.2023.10340275
– volume: 19
  start-page: 1
  issue: 6
  year: 2019
  ident: 1342_CR7
  publication-title: Sensors (Switzerland)
  doi: 10.3390/s19061423
– volume: 13
  issue: 2
  year: 2016
  ident: 1342_CR33
  publication-title: J Neural Eng
  doi: 10.1088/1741-2560/13/2/026013
SSID ssj0034054
Score 2.4240363
Snippet This research focused on the development of a motor imagery (MI) based brain-machine interface (BMI) using deep learning algorithms to control a lower-limb...
SourceID doaj
proquest
gale
pubmed
crossref
SourceType Open Website
Aggregation Database
Index Database
Enrichment Source
StartPage 48
SubjectTerms Algorithms
Analysis
Brain
Brain–machine interface
Calibration
Closed loops
Cognitive tasks
Decision making
Deep learning
EEG
Electrodes
Electroencephalography
Exoskeleton
Exoskeletons
Experiments
Feature extraction
Feedback control
Health aspects
Machine learning
Man-machine interfaces
Mental task performance
Methods
Neural networks
Questionnaires
Rehabilitation
Robot control
Robotics
Transfer learning
User interface
Wavelet transforms
Title Brain–machine interface based on deep learning to control asynchronously a lower-limb robotic exoskeleton: a case-of-study
URI https://www.ncbi.nlm.nih.gov/pubmed/38581031
https://www.proquest.com/docview/3037873896
https://www.proquest.com/docview/3034243190
https://doaj.org/article/ef97d457680b494fad9c7e0aeea56663
Volume 21
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider Scholars Portal
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Brain-machine+interface+based+on+deep+learning+to+control+asynchronously+a+lower-limb+robotic+exoskeleton%3A+a+case-of-study&rft.jtitle=Journal+of+neuroengineering+and+rehabilitation&rft.au=Ferrero%2C+Laura&rft.au=Soriano-Segura%2C+Paula&rft.au=Navarro%2C+Jacobo&rft.au=Jones%2C+Oscar&rft.date=2024-04-05&rft.issn=1743-0003&rft.eissn=1743-0003&rft.volume=21&rft.issue=1&rft.spage=48&rft_id=info:doi/10.1186%2Fs12984-024-01342-9&rft.externalDBID=NO_FULL_TEXT
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=1743-0003&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=1743-0003&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=1743-0003&client=summon