Unsupervised detection of microsaccades in a high-noise regime

Bibliographic Details
Published in: Journal of Vision (Charlottesville, Va.), Vol. 18, No. 6
Main Authors: Sheynikhovich, Denis; Bécu, Marcia; Wu, Changmin; Arleo, Angelo
Format: Journal Article
Language: English
Published: Association for Research in Vision and Ophthalmology, 01.06.2018
Summary: Micromovements of the eye during visual fixations provide clues about how our visual system acquires information. The analysis of fixational eye movements can thus serve as a noninvasive means to detect age-related or pathological changes in visual processing, which can in turn reflect associated cognitive or neurological disorders. However, the utility of such diagnostic approaches relies on the quality and usability of the detection methods applied in the eye movement analysis. Here, we propose a novel method for (micro)saccade detection that is resistant to high-frequency recording noise, a frequent problem in video-based eye tracking of either aged subjects or subjects suffering from a vision-related pathology. The method is fast, it does not require manual noise removal, and it can work with position, velocity, or acceleration features, or a combination thereof. The detection accuracy of the proposed method is assessed on a new dataset of manually labeled recordings acquired from 14 subjects of advanced age (69-81 years old) performing an ocular fixation task. It is demonstrated that the detection accuracy of the new method compares favorably to that of two frequently used reference methods, and that it is comparable to the better of the two algorithms when tested on an existing low-noise eye-tracking dataset.
ISSN: 1534-7362
DOI: 10.1167/18.6.19
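
For readers who want a concrete feel for the kind of detection discussed in the summary, the sketch below implements a common velocity-threshold (micro)saccade detector with a median-based noise estimate. It is not the authors' proposed algorithm, and it is not necessarily one of the two reference methods the abstract mentions; the sampling rate fs, the threshold multiplier lam, and the minimum event duration min_samples are illustrative assumptions.

```python
import numpy as np

def detect_saccades(x, y, fs=500.0, lam=6.0, min_samples=3):
    """Flag candidate (micro)saccades in horizontal/vertical gaze traces.

    x, y        : 1-D arrays of gaze position (e.g., in degrees)
    fs          : sampling rate in Hz (assumed value)
    lam         : multiplier on the median-based velocity noise level (assumed)
    min_samples : minimum event duration in samples (assumed)
    Returns a list of (start, end) index pairs into the velocity arrays.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)

    # Velocity from a 5-point moving difference, which suppresses some noise.
    vx = fs * (x[4:] + x[3:-1] - x[1:-3] - x[:-4]) / 6.0
    vy = fs * (y[4:] + y[3:-1] - y[1:-3] - y[:-4]) / 6.0

    # Robust, median-based estimate of the velocity noise level per axis.
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)

    # Samples whose velocity falls outside an elliptic threshold are "saccadic".
    above = (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0

    # Group consecutive above-threshold samples into events of minimum duration.
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_samples:
                events.append((start, i - 1))
            start = None
    if start is not None and len(above) - start >= min_samples:
        events.append((start, len(above) - 1))
    return events
```

Because the velocity noise level is estimated with medians rather than means, the threshold adapts to each recording's noise; the paper's point, however, is that purely velocity-based thresholds of this kind can struggle with the high-frequency noise typical of video-based tracking in older subjects, which motivates its noise-resistant alternative.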