STOW-Net: Spatio-Temporal Operation Based Deep Learning Network for Classifying Wavelet Transformed Motor Imagery EEG Signals
| Published in | TENCON ... IEEE Region Ten Conference, pp. 860-863 |
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 31.10.2023 |
| Summary | Brain-computer interface (BCI) systems rely on capturing characteristics of human brain activity from electroencephalography (EEG) signals, especially for the reliable classification of motor imagery (MI) tasks. For multi-channel EEG signals, it is crucial to precisely capture the spatio-temporal variation along with the frequency characteristics. Hence, instead of operating directly on raw EEG data, this paper first applies the discrete wavelet transform (DWT) to the motor-imagery multi-channel EEG data and then designs a deep learning architecture incorporating spatio-temporal operations that acts on the DWT-transformed EEG signal. In the proposed architecture, temporal convolution followed by spatial convolution is performed on the DWT-transformed MI-EEG signal; this part is termed SAT-Net. Next, a convolutional operation over all channels together reduces the number of channels; this part is termed SOC-Net. Finally, a fully connected layer classifies the MI-EEG data from the derived feature vector. Extensive experiments are performed on multiple subjects from the MI-based EEG dataset BCI Competition IV 2a. The proposed model achieves a classification accuracy of 84.65%, consistently outperforming several state-of-the-art methods. |
| ISSN | 2159-3450 |
| DOI | 10.1109/TENCON58879.2023.10322450 |
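The summary above describes a concrete pipeline: a channel-wise DWT of the raw MI-EEG trial, temporal convolution followed by spatial convolution (SAT-Net), a channel-reducing convolution (SOC-Net), and a fully connected classifier. Below is a minimal sketch of that pipeline, assuming PyTorch and PyWavelets; the wavelet choice (`db4`), decomposition level, kernel sizes, filter counts, and pooling sizes are illustrative assumptions, not the authors' published configuration.

```python
# Hedged sketch of the STOW-Net pipeline described in the abstract.
# All hyperparameters below are illustrative guesses.
import numpy as np
import pywt
import torch
import torch.nn as nn

def dwt_features(eeg, wavelet="db4", level=2):
    """Channel-wise discrete wavelet transform of a raw EEG trial.

    eeg: array of shape (channels, samples).
    Returns concatenated DWT coefficients per channel, shape (channels, n_coeffs).
    """
    rows = []
    for ch in eeg:
        coeffs = pywt.wavedec(ch, wavelet, level=level)  # [cA2, cD2, cD1]
        rows.append(np.concatenate(coeffs))
    return np.stack(rows)

class STOWNet(nn.Module):
    """Temporal conv then spatial conv (SAT-Net), channel-reducing conv
    (SOC-Net), then a fully connected classifier, per the abstract."""
    def __init__(self, n_channels=22, n_classes=4, coeff_len=512):
        super().__init__()
        # SAT-Net: convolve along the coefficient (time-like) axis first,
        # then across the EEG channel axis.
        self.sat = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),  # temporal
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),          # spatial
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
        )
        # SOC-Net: 1x1 convolution mixing all feature maps to shrink the
        # channel dimension before classification.
        self.soc = nn.Sequential(
            nn.Conv2d(32, 8, kernel_size=1),
            nn.ELU(),
            nn.AdaptiveAvgPool2d((1, coeff_len // 8)),
        )
        self.classifier = nn.Linear(8 * (coeff_len // 8), n_classes)

    def forward(self, x):          # x: (batch, 1, channels, coeff_samples)
        x = self.sat(x)
        x = self.soc(x)
        return self.classifier(torch.flatten(x, 1))

# Usage on one dummy trial shaped like BCI Competition IV 2a data
# (22 EEG channels, 4 MI classes).
trial = np.random.randn(22, 1000).astype(np.float32)
feats = dwt_features(trial)                      # (22, n_coeffs)
feats = feats[:, :512]                           # crop to a fixed length
x = torch.from_numpy(feats).float()[None, None]  # (1, 1, 22, 512)
logits = STOWNet(coeff_len=512)(x)
print(logits.shape)                              # torch.Size([1, 4])
```

The ordering mirrors the abstract: the spatial convolution spans all 22 electrodes at once (kernel height equals the channel count), so spatial mixing happens only after per-channel temporal filtering, and SOC-Net's pointwise convolution is one plausible reading of "considering all channels together to reduce the number of channels."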