Removing Neural Signal Artifacts with Autoencoder-Targeted Adversarial Transformers (AT-AT)
| Format | Journal Article |
|---|---|
| Language | English |
| Published | 07.02.2025 |
| DOI | 10.48550/arxiv.2502.05332 |
Summary: Electromyogenic (EMG) noise is a major contamination source in EEG data that can impede accurate analysis of brain-specific neural activity. Recent literature on EMG artifact removal has moved beyond traditional linear algorithms in favor of machine learning-based systems. However, existing deep learning-based filtration methods often have large compute footprints and prohibitively long training times. In this study, we present a new machine learning-based system for filtering EMG interference from EEG data using an autoencoder-targeted adversarial transformer (AT-AT). By leveraging the lightweight expressivity of an autoencoder to determine optimal time-series transformer application sites, our AT-AT architecture achieves a >90% model size reduction compared to published artifact removal models. The addition of adversarial training ensures that filtered signals adhere to the fundamental characteristics of EEG data. We trained AT-AT using published neural data from 67 subjects and found that the system achieved test performance comparable to that of larger models; AT-AT posted a mean reconstructive correlation coefficient above 0.95 at an initial signal-to-noise ratio (SNR) of 2 dB and 0.70 at -7 dB SNR. Further research generalizing these results to broader sample sizes beyond these isolated test cases will be crucial; while outside the scope of this study, we also include results from a real-world deployment of AT-AT in the Appendix.
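
As a rough illustration of the pipeline the summary describes, the sketch below shows the core idea of using a lightweight autoencoder's reconstruction error to decide where a time-series transformer is applied. This is not the authors' code: every module name (`WindowAutoencoder`, `TransformerDenoiser`, `at_at_filter`), layer size, window length, and threshold is an assumption, and the adversarial discriminator used during training is reduced to a comment.

```python
# Hypothetical sketch of the AT-AT idea: a small autoencoder scores fixed-size
# windows of an EEG signal by reconstruction error, and only the high-error
# (likely EMG-contaminated) windows are passed through a transformer denoiser.
import torch
import torch.nn as nn


class WindowAutoencoder(nn.Module):
    """Lightweight autoencoder used only to localize artifact-heavy windows."""

    def __init__(self, window: int = 128, hidden: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(window, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, window)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))


class TransformerDenoiser(nn.Module):
    """Time-series transformer applied only where the autoencoder flags noise."""

    def __init__(self, d_model: int = 64):
        super().__init__()
        self.proj_in = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.proj_out = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # (batch, window) -> one token per time sample -> (batch, window)
        h = self.proj_in(x.unsqueeze(-1))
        return self.proj_out(self.encoder(h)).squeeze(-1)


@torch.no_grad()
def at_at_filter(signal, ae, tf, window=128, threshold=0.5):
    """Denoise only the windows whose reconstruction error is high.

    During training, an adversarial discriminator would additionally score
    the denoiser's outputs so that filtered windows keep EEG-like
    statistics; that component is omitted from this inference sketch.
    """
    clean = signal.clone()
    for start in range(0, signal.numel() - window + 1, window):
        seg = signal[start:start + window].unsqueeze(0)
        err = torch.mean((ae(seg) - seg) ** 2)
        if err > threshold:  # likely EMG-contaminated: apply the transformer
            clean[start:start + window] = tf(seg).squeeze(0)
    return clean


if __name__ == "__main__":
    eeg = torch.randn(1024)  # stand-in single-channel recording
    print(at_at_filter(eeg, WindowAutoencoder(), TransformerDenoiser()).shape)
```

Gating the transformer on autoencoder reconstruction error is what lets most of the signal bypass the expensive model, which is consistent with the >90% size reduction the summary attributes to targeted application.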
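
The two headline numbers in the summary, the input SNR in dB and the reconstructive correlation coefficient, have standard definitions, sketched below with NumPy under the assumption that the paper uses the usual power-ratio SNR and Pearson correlation; the signals here are synthetic stand-ins, not the paper's data.

```python
# Hedged illustration of the reported quantities: input signal-to-noise ratio
# in dB, and Pearson correlation between a filtered signal and the clean EEG.
import numpy as np


def snr_db(clean: np.ndarray, noise: np.ndarray) -> float:
    """SNR (dB) = 10 * log10(signal power / noise power)."""
    return 10.0 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))


def reconstructive_cc(filtered: np.ndarray, clean: np.ndarray) -> float:
    """Pearson correlation between the filtered signal and the clean target."""
    return float(np.corrcoef(filtered, clean)[0, 1])


rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 20 * np.pi, 2000))  # stand-in "EEG"
noise = rng.normal(scale=0.6, size=clean.shape)   # stand-in "EMG"
print(f"input SNR: {snr_db(clean, noise):.1f} dB")
print(f"cc of unfiltered mix: {reconstructive_cc(clean + noise, clean):.2f}")
```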