EEG-based recognition of hand movement and its parameter

Bibliographic Details
Published in: Journal of Neural Engineering, Vol. 22, No. 2, pp. 26006-26025
Main Authors: Yan, Yuxuan; Li, Jianguang; Yin, Mingyue
Format: Journal Article
Language: English
Published: England: IOP Publishing, 06.03.2025
Summary: Objective. Brain–computer interface is a cutting-edge technology that enables interaction with external devices by decoding human intentions, and it is highly valuable in the fields of medical rehabilitation and human-robot collaboration. Decoding motor intent from electroencephalographic (EEG) signals during motor execution (ME) is still at the feasibility-study stage, and the accuracy of ME EEG signal recognition in between-subjects classification has not yet been studied sufficiently to reach the level required for realistic applications. This paper investigates EEG-based hand movement recognition by analyzing low-frequency time-domain information. Approach. Experiments comprising a four-class hand-movement task, two force-parameter tasks (picking up and pushing), and a four-target directional displacement task were designed and executed, and EEG data from thirteen healthy volunteers were collected. A sliding-window approach is used to expand the dataset and address overfitting on the EEG signals. Furthermore, a Convolutional Neural Network-Bidirectional Long Short-Term Memory (CNN-BiLSTM) model, an end-to-end serial combination of a CNN and a BiLSTM, is constructed to classify and recognize hand movements from the raw EEG data. Main results. According to the experimental results, the model classifies the four hand-movement types, the picking-up movements, the pushing movements, and the four-target directional displacement movements with accuracies of 99.14% ± 0.49%, 99.29% ± 0.11%, 99.23% ± 0.60%, and 98.11% ± 0.23%, respectively. Significance. Comparative tests against alternative deep learning models (LSTM, CNN, EEGNet, CNN-LSTM) demonstrate that the CNN-BiLSTM model achieves practicable accuracy for EEG-based hand movement recognition and parameter decoding.
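The record describes an end-to-end serial CNN-BiLSTM classifier applied to sliding-window segments of raw EEG, but it does not give layer sizes, window length, channel count, or overlap. The sketch below is therefore only a minimal illustration of that pipeline, assuming placeholder values (32 channels, 200-sample windows, 50-sample step, 4 classes, 64 hidden units); none of these hyperparameters are taken from the paper.

```python
import torch
import torch.nn as nn

class CNNBiLSTM(nn.Module):
    """Minimal CNN-BiLSTM sketch for raw EEG windows.

    Assumed input shape: (batch, channels, time); channel count and
    window length are placeholders, not values from the paper.
    """
    def __init__(self, n_channels=32, n_classes=4, hidden=64):
        super().__init__()
        # Temporal convolution over raw EEG (the CNN stage of the serial model)
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # BiLSTM over the CNN feature sequence
        self.bilstm = nn.LSTM(input_size=64, hidden_size=hidden,
                              batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                 # x: (batch, channels, time)
        feats = self.cnn(x)               # (batch, 64, time // 2)
        feats = feats.transpose(1, 2)     # (batch, time // 2, 64) for the LSTM
        out, _ = self.bilstm(feats)
        return self.fc(out[:, -1, :])     # classify from the last time step

def sliding_windows(trial, win=200, step=50):
    """Sliding-window augmentation sketch: split one trial of shape
    (channels, time) into overlapping windows, stacked along a new axis."""
    return torch.stack([trial[:, i:i + win]
                        for i in range(0, trial.shape[1] - win + 1, step)])

# Usage sketch with hypothetical shapes
if __name__ == "__main__":
    trial = torch.randn(32, 1000)               # one trial: 32 channels x 1000 samples
    windows = sliding_windows(trial)             # (n_windows, 32, 200)
    logits = CNNBiLSTM()(windows)                # (n_windows, 4) class scores
    print(logits.shape)
```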
Bibliography:JNE-108433
ISSN:1741-2560
1741-2552
DOI:10.1088/1741-2552/adba8a