Decoding Movement Imagination and Execution From EEG Signals Using BCI-Transfer Learning Method Based on Relation Network

Bibliographic Details
Published in: ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1354 - 1358
Main Authors: Lee, Do-Yeun; Jeong, Ji-Hoon; Shim, Kyung-Hwan; Lee, Seong-Whan
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2020

Summary: A brain-computer interface (BCI) is used to control external devices for healthy people as well as to rehabilitate motor functions for motor-disabled patients. Decoding movement intention is one of the most significant aspects of performing arm movement tasks using brain signals. Decoding movement execution (ME) from electroencephalogram (EEG) signals has shown high performance in previous works; however, movement imagination (MI) paradigm-based intention decoding has so far failed to achieve sufficient accuracy. In this study, we focused on a robust MI decoding method with transfer learning across the ME and MI paradigms. We acquired EEG data related to arm reaching in 3D directions and proposed a BCI-transfer learning method based on a Relation Network (BTRN) architecture. The decoding performance was higher than that of conventional methods. We confirmed that the BTRN architecture can contribute to continuous decoding of MI using ME datasets.
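The summary describes a Relation Network-based transfer-learning decoder (BTRN) that reuses movement-execution (ME) trials to classify movement-imagination (MI) trials. The sketch below is a minimal, illustrative Relation Network-style comparator in PyTorch, not the authors' BTRN implementation: the module names (EEGEmbedding, RelationModule), layer sizes, channel and sample counts, and the pairing of per-direction ME prototypes with MI queries are all assumptions made for the example.

```python
# Illustrative Relation Network-style EEG decoder (PyTorch).
# Assumed setup: 64-channel EEG trials of 250 samples, 3 reach directions,
# ME trials as the support set and MI trials as the queries to classify.
import torch
import torch.nn as nn

class EEGEmbedding(nn.Module):
    """Embeds one EEG trial (channels x time) into a feature vector."""
    def __init__(self, n_channels=64, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=25, stride=4),
            nn.BatchNorm1d(32), nn.ELU(),
            nn.Conv1d(32, feat_dim, kernel_size=11, stride=2),
            nn.BatchNorm1d(feat_dim), nn.ELU(),
            nn.AdaptiveAvgPool1d(1),          # -> (batch, feat_dim, 1)
        )

    def forward(self, x):                     # x: (batch, channels, time)
        return self.net(x).squeeze(-1)        # (batch, feat_dim)

class RelationModule(nn.Module):
    """Scores how strongly a query embedding relates to a class prototype."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * feat_dim, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),   # relation score in [0, 1]
        )

    def forward(self, query_emb, proto_emb):
        return self.net(torch.cat([query_emb, proto_emb], dim=-1)).squeeze(-1)

if __name__ == "__main__":
    embed, relate = EEGEmbedding(), RelationModule()
    support = torch.randn(6, 3, 64, 250)      # 6 ME trials per each of 3 directions
    query = torch.randn(8, 64, 250)           # 8 MI trials to classify
    # Class prototypes: mean embedding of the ME support trials per direction.
    protos = embed(support.view(-1, 64, 250)).view(6, 3, -1).mean(dim=0)  # (3, feat)
    q = embed(query)                                                      # (8, feat)
    # Relation score of every MI query against every ME-derived prototype.
    scores = torch.stack([relate(q, protos[c].expand_as(q)) for c in range(3)], dim=1)
    print(scores.argmax(dim=1))               # predicted direction per MI trial
```

In this style of model, the embedding and relation modules would be trained end-to-end on ME episodes and then applied (or fine-tuned) on MI trials; the exact training and transfer procedure used in the paper is not specified in this record.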
ISSN:2379-190X
DOI:10.1109/ICASSP40776.2020.9052997