A Novel Small-data based Approach for Decoding Yes/No-Decisions of Locked-in Patients Using Generative Adversarial Networks
Published in: IEEE Access, Vol. 11, p. 1
Main Authors:
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023
Summary: We demonstrate how generative adversarial networks can mitigate the small-data problem that arises when training brain-computer interfaces. The approach is based on finely graded frequency bands extracted from motor imagery electroencephalography (EEG) data with the power spectral density method; these band features are then used to synthetically generate EEG data with generative adversarial networks. We evaluate the approach on one of the currently largest publicly available EEG datasets, first checking the synthetic and real data for statistical and visual similarity, and then training a random forest classifier once on the real data alone and once on the real data augmented with the synthetic data. With similarity scores of 95.72 % (subject-dependent) and 83.51 % (subject-independent), and predictive gains of 17.53 % (subject-dependent) and 7.51 % (subject-independent), we achieved promising results. They show that the approach can make it possible to research rare diseases for which too little patient data is available, and that synthetic data can let many EEG-based brain-computer interface applications obtain the required data more cost- and time-efficiently.
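The pipeline the summary describes (finely graded power-spectral-density band features feeding a random forest, trained once on real data and once on real data augmented with synthetic samples) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy two-class "motor imagery" signals, the 1 Hz band width, and the noise-perturbed copies standing in for GAN-generated samples are all assumptions made here for a self-contained example.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def psd_band_features(eeg, fs=250, band_width=1.0, fmin=4.0, fmax=40.0):
    """Mean PSD power in finely graded frequency bands (band_width Hz wide)."""
    freqs, pxx = welch(eeg, fs=fs, nperseg=fs)  # 1 Hz frequency resolution
    edges = np.arange(fmin, fmax + band_width, band_width)
    return np.array([pxx[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in zip(edges[:-1], edges[1:])])

def make_trial(label, n=500, fs=250):
    """Toy stand-in for a motor imagery trial: mu- vs beta-band rhythm."""
    t = np.arange(n) / fs
    f = 10.0 if label == 0 else 22.0
    return np.sin(2 * np.pi * f * t) + 0.5 * rng.standard_normal(n)

# Small real dataset of PSD band features.
y_real = np.array([i % 2 for i in range(40)])
X_real = np.array([psd_band_features(make_trial(y)) for y in y_real])

# Stand-in for GAN output: perturbed copies of the real feature vectors.
# (The paper trains a GAN here; this placeholder only shows the augmentation step.)
X_syn = X_real + 0.05 * rng.standard_normal(X_real.shape)
y_syn = y_real.copy()

# Classifier on real data only vs. real data augmented with synthetic data.
clf_real = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_real, y_real)
X_aug = np.vstack([X_real, X_syn])
y_aug = np.concatenate([y_real, y_syn])
clf_aug = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_aug, y_aug)
```

Comparing the two classifiers on held-out real trials would mirror the paper's evaluation of the predictive gain from augmentation.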
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3326720