A semi-simulated EEG/EOG dataset for the comparison of EOG artifact rejection techniques

Bibliographic Details
Published in: Data in Brief, Vol. 8, pp. 1004-1006
Main Authors: Klados, Manousos A.; Bamidis, Panagiotis D.
Format: Journal Article
Language: English
Published: Netherlands: Elsevier Inc., 01.09.2016
Summary: Artifact rejection techniques are used to recover the brain signals underlying artifactual electroencephalographic (EEG) segments. Although many different artifact rejection techniques have been proposed over the last few years ([1] http://dx.doi.org/10.1109/JSEN.2011.2115236, [2] http://dx.doi.org/10.1016/j.clinph.2006.09.003, [3] http://dx.doi.org/10.3390/e16126553), none has been established as a gold standard so far, because assessing their performance is difficult and subjective ([4] http://dx.doi.org/10.1109/ITAB.2009.5394295, [5] http://dx.doi.org/10.1016/j.bspc.2011.02.001, [6] http://dx.doi.org/10.1007/978-3-540-89208-3_300). This limitation stems mainly from the fact that the underlying artifact-free brain signal is unknown, so there is no objective way to measure how close the retrieved signal is to the real one. This article addresses that problem by presenting a semi-simulated EEG dataset in which artifact-free EEG signals are manually contaminated with ocular artifacts using a realistic head model. The key feature of this dataset is that it contains the pre-contamination EEG signals, so the brain signals underlying the EOG artifacts are known and the performance of any artifact rejection technique can therefore be objectively assessed.
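The core idea, that a known pre-contamination signal turns evaluation into a direct error measurement, can be illustrated with a minimal Python sketch. This is not the authors' code or contamination model: the mixing coefficient, the synthetic signals, and the toy regression-based rejection step are assumptions used only to show how an objective score can be computed once the clean EEG is known.

    # Minimal sketch (illustrative only): a known artifact-free signal allows
    # objective scoring of an artifact rejection method. All names, the mixing
    # coefficient, and the toy rejection step are assumptions, not the dataset's
    # actual contamination procedure.
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 200                                  # assumed sampling rate in Hz
    t = np.arange(0, 30, 1 / fs)              # 30 s single-channel example

    clean_eeg = rng.standard_normal(t.size)   # stands in for the artifact-free EEG
    eog = np.sin(2 * np.pi * 0.3 * t) ** 9    # crude stand-in for a blink-like EOG trace
    b = 0.8                                   # assumed propagation coefficient (the dataset
                                              # derives such coefficients from a head model)

    contaminated = clean_eeg + b * eog        # semi-simulated, artifact-laden channel

    def reject_by_regression(y, v):
        """Toy EOG regression: fit the propagation coefficient by least squares
        and subtract the fitted EOG contribution. Stands in for any rejection method."""
        b_hat = np.dot(y, v) / np.dot(v, v)
        return y - b_hat * v

    recovered = reject_by_regression(contaminated, eog)

    # Because the pre-contamination signal is known, the residual error is measurable.
    rmse = np.sqrt(np.mean((recovered - clean_eeg) ** 2))
    print(f"RMSE against the known artifact-free EEG: {rmse:.4f}")

Any rejection method can be slotted into the same loop and compared on the same error metric, which is exactly the kind of objective benchmarking the dataset is designed to enable.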
ISSN: 2352-3409
DOI: 10.1016/j.dib.2016.06.032