Denoising Autoencoder-Based Feature Extraction to Robust SSVEP-Based BCIs

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 21, No. 15, p. 5019
Main Authors: Chen, Yeou-Jiunn; Chen, Pei-Chung; Chen, Shih-Chung; Wu, Chung-Min
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 23.07.2021

Summary: For subjects with amyotrophic lateral sclerosis (ALS), verbal and nonverbal communication are greatly impaired. Steady-state visually evoked potential (SSVEP)-based brain-computer interfaces (BCIs) are a successful alternative and augmentative communication channel that helps subjects with ALS communicate with other people or with devices. In practical applications, however, the performance of SSVEP-based BCIs is severely degraded by noise, so developing robust SSVEP-based BCIs is essential. In this study, noise-suppression-based feature extraction and a deep neural network are proposed to build a robust SSVEP-based BCI. To suppress the effects of noise, a denoising autoencoder is used to extract denoised features. To obtain recognition results acceptable for practical applications, a deep neural network then produces the decisions of the SSVEP-based BCI. The experimental results showed that the proposed approach effectively suppresses the effects of noise and greatly improves the performance of SSVEP-based BCIs. Moreover, the deep neural network outperforms the other approaches evaluated. The proposed robust SSVEP-based BCI is therefore well suited to practical applications.
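The core idea summarized above, training an autoencoder to reconstruct a clean signal from a noise-corrupted input and using its learned representation as denoised features, can be illustrated with a minimal sketch. This is not the paper's implementation: the synthetic "SSVEP-like" data (two sinusoidal stimulation frequencies plus Gaussian noise), the network size, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for SSVEP epochs: 1 s windows sampled at 128 Hz,
# each containing one of two assumed stimulation frequencies; additive
# Gaussian noise crudely models EEG artifacts.
fs, n_samples = 128, 128
t = np.arange(n_samples) / fs
freqs = [8.0, 13.0]
clean = np.stack([np.sin(2 * np.pi * f * t) for f in freqs for _ in range(200)])
noisy = clean + 0.5 * rng.standard_normal(clean.shape)

# Minimal single-hidden-layer denoising autoencoder: the encoder maps the
# noisy epoch to hidden features; the decoder is trained to reproduce the
# clean epoch, so the hidden activations act as denoised features that a
# downstream classifier (a deep network in the paper) could consume.
n_hidden = 32
W1 = 0.1 * rng.standard_normal((n_samples, n_hidden))
b1 = np.zeros(n_hidden)
W2 = 0.1 * rng.standard_normal((n_hidden, n_samples))
b2 = np.zeros(n_samples)

def forward(x):
    h = np.tanh(x @ W1 + b1)          # encoder (denoised feature vector)
    return h, h @ W2 + b2             # decoder with linear output

lr = 0.01
for _ in range(300):                  # plain batch gradient descent on MSE
    h, recon = forward(noisy)
    err = recon - clean               # gradient of MSE w.r.t. the output
    gW2 = h.T @ err / len(noisy)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)  # backprop through tanh
    gW1 = noisy.T @ dh / len(noisy)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

features, recon = forward(noisy)
mse_noisy = np.mean((noisy - clean) ** 2)
mse_recon = np.mean((recon - clean) ** 2)
print(mse_noisy, mse_recon)          # reconstruction error should fall below raw-noise error
```

In this sketch `features` plays the role of the denoised feature vector; in the paper those features feed a deep neural network that outputs the final SSVEP class decision.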
ISSN: 1424-8220
DOI: 10.3390/s21155019