EEG Based Automated Detection of Six Different Eye Movement Conditions for Implementation in Personal Assistive Application

Bibliographic Details
Published in: Wireless Personal Communications, Vol. 124, No. 1, pp. 909–930
Main Authors: Paul, Avishek; Chakraborty, Abhishek; Sadhukhan, Deboleena; Pal, Saurabh; Mitra, Madhuchhanda
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.05.2022
More Information
Summary: Different forms of human expression are now being extensively used in present-day human–machine interfaces to provide assistive support to the elderly and disabled population. Depending on the subject's condition, expressions conveyed through eye movements often provide the most efficient way of communication. Nowadays, standard electroencephalogram (EEG) based arrangements, normally used to analyze neurological states, are also being adopted for the detection of eye movements. However, recent state-of-the-art EEG-based studies lag behind, as the majority of works either detect eye movements in fewer directions or use a high feature dimension with limited classification accuracy. In this study, a robust, simple, and automated algorithm is proposed that exploits analysis of the EEG signal to classify eye movements in six different directions. The algorithm uses discrete wavelet transformation to denoise the EEG signals acquired from six different leads. Then, from the reconstructed wavelet coefficients of each lead, two features are extracted and combined to form a binary feature map. The obtained binary feature map itself facilitates distinct visual classification of the eye movements. Finally, a unique value generated from the weighted sum of the binary map is used to classify six different types of eye movements via a threshold-based classification technique. The algorithm achieves high average accuracy, sensitivity, and specificity of 95.85%, 95.83%, and 95.83%, respectively, using only a single value. Compared to other state-of-the-art methods, the adopted unique binarization methodology and the obtained results indicate the strong potential of the proposed algorithm for implementation in personal assistive applications.
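The summary outlines a multi-stage pipeline: wavelet denoising of each lead, extraction of two features per lead, binarization into a feature map, and classification from a single weighted-sum value. The sketch below illustrates that flow in Python, assuming PyWavelets for the DWT step. The wavelet choice, the two per-lead features (peak amplitude and signal energy), the binarization thresholds, the bit weights, and the class ranges are illustrative placeholders, since the abstract does not specify them; this is not the authors' actual configuration.

```python
# Minimal sketch of the described pipeline (assumptions noted in comments).
import numpy as np
import pywt


def denoise_lead(signal, wavelet="db4", level=4):
    """Denoise one EEG lead via soft-thresholded DWT reconstruction.
    Wavelet family, level, and universal threshold are assumed, not from the paper."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]


def binary_feature_map(leads, amp_thr=50.0, energy_thr=1e4):
    """Extract two features per lead and binarize them.
    The features (peak amplitude, energy) and thresholds are hypothetical."""
    bits = []
    for lead in leads:                        # leads: iterable of 1-D arrays (6 leads)
        clean = denoise_lead(np.asarray(lead, dtype=float))
        peak = np.max(np.abs(clean))          # placeholder feature 1
        energy = np.sum(clean ** 2)           # placeholder feature 2
        bits += [int(peak > amp_thr), int(energy > energy_thr)]
    return np.array(bits)                     # 12-bit map for 6 leads


def classify(bit_map, class_ranges):
    """Collapse the binary map into one weighted-sum value and assign a class
    by comparing it against per-class value ranges (threshold-based)."""
    weights = 2 ** np.arange(bit_map.size)    # binary weighting (assumed)
    value = int(np.dot(bit_map, weights))
    for label, (lo, hi) in class_ranges.items():
        if lo <= value <= hi:
            return label, value
    return "unknown", value


# Hypothetical usage with six leads of synthetic data and placeholder ranges
# partitioning the 0–4095 value space into six classes.
rng = np.random.default_rng(0)
leads = [rng.standard_normal(512) for _ in range(6)]
ranges = {f"eye_movement_{k + 1}": (k * 683, (k + 1) * 683 - 1) for k in range(6)}
print(classify(binary_feature_map(leads), ranges))
```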
ISSN: 0929-6212
eISSN: 1572-834X
DOI: 10.1007/s11277-021-09389-w