Deep Learning Model for EEG Signal Analysis

Bibliographic Details
Published in: IEEE Access, Vol. 13, pp. 91034-91045
Main Authors: Gupta, Varun; Kumar, Vivek; Prince; Singh, Saurabh; Lee, Young-Seok; Ra, In-Ho
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025
Summary: Analyzing the physiological information within an acquired EEG signal is cumbersome due to several factors, viz. noise and artifacts, the complexity of brain dynamics, and inter-subject variability. To address these issues, this paper compares a U-shaped encoder-decoder network (UNET) and a Bat-based UNET signal analysis (BUSA) technique for classifying depression rates in electroencephalogram (EEG) datasets. The main objective of including these two techniques is to reveal their relative effectiveness. The framework comprises pre-processing, feature extraction, feature selection, and classification stages. It excels at noise reduction during pre-processing, enhancing dataset integrity. Feature extraction leverages band power and correlation dimension to extract crucial features, and feature selection optimizes classification accuracy by refining the fitness function of the bats in the classification layer. The performance of UNET and BUSA is compared on the following evaluation parameters: accuracy (Acc), area under the curve (AUC), precision (P), and recall (R) (or sensitivity (Se)). The results indicate that the BUSA technique outperforms the UNET technique.
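The evaluation parameters named in the summary (Acc, P, and R/Se) can be sketched as below. This is a minimal illustration of the standard definitions, not the paper's implementation; the example labels (1 = depressed class, 0 = control) and predictions are hypothetical, not drawn from the paper's EEG datasets.

```python
def confusion_counts(y_true, y_pred, positive=1):
    """Count true/false positives and negatives for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    return tp, fp, tn, fn

def metrics(y_true, y_pred):
    """Accuracy, precision, and recall (recall == sensitivity, Se)."""
    tp, fp, tn, fn = confusion_counts(y_true, y_pred)
    acc = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return {"Acc": acc, "P": precision, "R": recall}

# Hypothetical labels: 1 = depressed class, 0 = control.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
print(metrics(y_true, y_pred))  # → {'Acc': 0.75, 'P': 0.75, 'R': 0.75}
```

AUC, the remaining parameter, requires the classifier's ranked scores rather than hard labels, so it is omitted from this sketch.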
Bibliography: ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3563760