AnEEG: leveraging deep learning for effective artifact removal in EEG data

Bibliographic Details
Published in: Scientific Reports, Vol. 14, No. 1, Article 24234 (20 pp.)
Main Authors: Kalita, Bhabesh; Deb, Nabamita; Das, Daisy
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 16.10.2024

Summary: In neuroscience and clinical diagnostics, electroencephalography (EEG) is a crucial instrument for capturing neural activity. However, the recorded signal is contaminated by artifacts such as muscle activity, eye blinks, and environmental interference, which makes it harder to retrieve meaningful information. In recent years, deep learning methods have demonstrated the potential to reduce these artifacts and enhance EEG quality. In this work, a novel deep learning method, “AnEEG”, is presented for eliminating artifacts from EEG signals. The quantitative metrics NMSE, RMSE, CC, SNR, and SAR are calculated to confirm the effectiveness of the proposed model. The suggested model outperformed wavelet decomposition techniques: it achieves lower NMSE and RMSE values, indicating closer agreement with the original signal; higher CC values, indicating stronger linear agreement with the ground-truth signals; and improvements in both SNR and SAR. Overall, the proposed approach shows promising results in improving the quality of EEG data by utilizing deep learning.
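The abstract lists its evaluation metrics without defining them. Below is a minimal Python sketch (not the authors' code) of how NMSE, RMSE, CC, and SNR are commonly computed between a ground-truth segment and a denoised reconstruction in the EEG-denoising literature; SAR is omitted because its definition varies across papers. All function names and the synthetic data are illustrative assumptions.

```python
import numpy as np

def nmse(clean: np.ndarray, denoised: np.ndarray) -> float:
    """Normalized mean squared error: MSE scaled by the power of the clean signal."""
    return float(np.mean((clean - denoised) ** 2) / np.mean(clean ** 2))

def rmse(clean: np.ndarray, denoised: np.ndarray) -> float:
    """Root mean squared error between clean and denoised signals."""
    return float(np.sqrt(np.mean((clean - denoised) ** 2)))

def cc(clean: np.ndarray, denoised: np.ndarray) -> float:
    """Pearson correlation coefficient (linear agreement with ground truth)."""
    return float(np.corrcoef(clean, denoised)[0, 1])

def snr_db(clean: np.ndarray, denoised: np.ndarray) -> float:
    """Signal-to-noise ratio in dB: clean-signal power over residual-error power."""
    residual = clean - denoised
    return float(10 * np.log10(np.mean(clean ** 2) / np.mean(residual ** 2)))

# Illustrative check on synthetic data: a 10 Hz sinusoid standing in for a
# clean EEG segment, and an imperfect reconstruction with leftover noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 2, 512)
clean = np.sin(2 * np.pi * 10 * t)
denoised = clean + 0.1 * rng.standard_normal(t.size)

print(f"NMSE={nmse(clean, denoised):.4f}, RMSE={rmse(clean, denoised):.4f}, "
      f"CC={cc(clean, denoised):.4f}, SNR={snr_db(clean, denoised):.2f} dB")
```

Under these conventions, lower NMSE/RMSE and higher CC/SNR correspond to the improvements the abstract reports for AnEEG over wavelet decomposition.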
ISSN: 2045-2322
DOI: 10.1038/s41598-024-75091-z