Deep Learning Model for Analyzing EEG Signal Analysis
Published in | IEEE Access, Vol. 13, pp. 91034 - 91045 |
---|---|
Main Authors | , , , , , |
Format | Journal Article |
Language | English |
Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025 |
Summary | Analyzing the physiological information in an acquired EEG signal is challenging due to several factors, viz. noise and artifacts, the complexity of brain dynamics, and inter-subject variability. To address these issues, this paper compares a U-shaped encoder-decoder network (UNET) and a Bat-based UNET signal analysis (BUSA) technique for classifying depression rates in electroencephalogram (EEG) datasets. The main objective of comparing the two techniques is to reveal their relative effectiveness. The framework comprises pre-processing, feature extraction, feature selection, and classification stages. Pre-processing focuses on noise reduction, improving dataset integrity. Feature extraction leverages band power and correlation dimension to obtain the crucial features. Feature selection then optimizes classification accuracy by refining the fitness function of the bats feeding the classification layer. The performance of UNET and BUSA is compared using the following evaluation parameters: accuracy (Acc), area under the curve (AUC), precision (P), and recall (R) (or sensitivity (Se)); an illustrative sketch of the band-power features and these metrics follows this record. The results indicate that the BUSA technique outperforms the UNET technique. |
ISSN | 2169-3536 |
DOI | 10.1109/ACCESS.2025.3563760 |
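The abstract describes band-power feature extraction and a four-metric comparison (Acc, AUC, P, R). The snippet below is a minimal Python sketch of those two steps only, under assumptions not taken from the paper: the band edges, sampling rate, Welch parameters, and decision threshold are illustrative, and the bat-based feature selection and the UNET/BUSA classifiers themselves are not reproduced here.

```python
# Sketch of per-channel EEG band-power features and the evaluation metrics
# named in the abstract. All numeric settings below are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.metrics import (accuracy_score, roc_auc_score,
                             precision_score, recall_score)

# Assumed frequency bands (Hz); the paper does not specify its band edges here.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}


def band_power_features(eeg, fs=256):
    """eeg: array of shape (n_channels, n_samples). Returns one band-power
    value per (band, channel) by integrating the Welch PSD over each band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.trapz(psd[:, mask], freqs[mask], axis=-1))
    return np.concatenate(feats)


def evaluate(y_true, y_score, threshold=0.5):
    """Compute the four comparison metrics used in the paper: Acc, AUC,
    precision (P), and recall (R), i.e. sensitivity (Se)."""
    y_pred = (np.asarray(y_score) >= threshold).astype(int)
    return {
        "Acc": accuracy_score(y_true, y_pred),
        "AUC": roc_auc_score(y_true, y_score),
        "P": precision_score(y_true, y_pred),
        "R": recall_score(y_true, y_pred),
    }
```

In practice the feature vectors from `band_power_features` (together with correlation-dimension features, not shown) would be filtered by the feature-selection stage before classification; `evaluate` is then applied to the classifier's scores on held-out recordings.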