Robotically aligned optical coherence tomography with 5 degree of freedom eye tracking for subject motion and gaze compensation

Bibliographic Details
Published in: Biomedical Optics Express, Vol. 12, No. 12, pp. 7361-7376
Main Authors: Ortiz, Pablo; Draelos, Mark; Viehland, Christian; Qian, Ruobing; McNabb, Ryan P.; Kuo, Anthony N.; Izatt, Joseph A.
Format: Journal Article
Language: English
Published: United States: Optical Society of America, 01.12.2021

Summary: Optical coherence tomography (OCT) has revolutionized diagnostics in ophthalmology. However, OCT requires a trained operator and patient cooperation to carefully align a scanner with the subject's eye and orient it in such a way that it images a desired region of interest at the retina. With the goal of automating this process of orienting and aligning the scanner, we developed a robot-mounted OCT scanner that automatically aligned with the pupil while matching its optical axis with the target region of interest at the retina. The system used two 3D cameras for face tracking and three high-resolution 2D cameras for pupil and gaze tracking. The tracking software identified 5 degrees of freedom for robot alignment and ray aiming through the ocular pupil: 3 degrees of translation and 2 degrees of orientation. We evaluated the accuracy, precision, and range of our tracking system and demonstrated imaging performance on free-standing human subjects. Our results demonstrate that the system stabilized images and that the addition of gaze tracking and aiming allowed for region-of-interest-specific alignment at any gaze orientation within a 28° range.
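The 5 degrees of freedom described in the summary (3 translational, 2 orientational) amount to computing a target pose for the scanner from the tracked pupil position and gaze direction. The following is a minimal sketch of that idea, not the authors' implementation: the coordinate convention (theta as horizontal gaze angle, phi as vertical), the function names, and the 25 mm standoff distance are all illustrative assumptions.

```python
import math

def gaze_to_direction(theta_deg, phi_deg):
    """Convert gaze angles (degrees) to a unit direction vector.
    Assumed convention: theta = horizontal (yaw), phi = vertical (pitch),
    with (0, 0) looking straight along +z."""
    t = math.radians(theta_deg)
    p = math.radians(phi_deg)
    return (math.cos(p) * math.sin(t),  # x: horizontal component
            math.sin(p),                # y: vertical component
            math.cos(p) * math.cos(t))  # z: forward component

def alignment_target(pupil_xyz, theta_deg, phi_deg, standoff=0.025):
    """5-DOF target pose (x, y, z, theta, phi): place the scanner a
    `standoff` distance (meters, assumed value) from the pupil along the
    gaze axis, oriented to aim back through the pupil at the retina."""
    d = gaze_to_direction(theta_deg, phi_deg)
    x, y, z = (pupil_xyz[i] + standoff * d[i] for i in range(3))
    return (x, y, z, theta_deg, phi_deg)

# Example: pupil at the origin, subject looking straight ahead ->
# scanner sits 25 mm in front of the pupil on the z axis.
pose = alignment_target((0.0, 0.0, 0.0), 0.0, 0.0)
```

In a real system the 3 translational values would come from the face/pupil cameras and the 2 angles from gaze tracking, with the robot servoing continuously to this pose to compensate for subject motion.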
ISSN: 2156-7085
DOI: 10.1364/BOE.443537