Real‐time fundus reconstruction and intraocular mapping using an ophthalmic endoscope

Bibliographic Details
Published in: The International Journal of Medical Robotics + Computer Assisted Surgery, Vol. 19, No. 3, p. e2496
Main Authors: Zhou, Dongbo; Takeyama, Hayato; Nakao, Shintaro; Sonoda, Koh‐Hei; Tadano, Kotaro
Format: Journal Article
Language: English
Published: England, Wiley Subscription Services, Inc., 01.06.2023
Summary: Background Robotic ophthalmic endoscope holders allow surgeons to perform dual‐hand operations in eye surgery. To prevent the needle‐like endoscope from injuring the retina while it moves, surgeons need real‐time visual information about the spatial relationship between the endoscope and the fundus. Methods This study develops a real‐time fundus reconstruction method. First, using deep learning, the method estimates the distance between the endoscope and the fundus region corresponding to every pixel of the RGB endoscopic image. Then, by combining the estimated distances with the kinematics of the robotic holder, a point cloud representing the currently observed fundus area is generated, from which the size and position of the eyeball are estimated. Results The method runs in real time at 10 Hz and is robust to eyeball movement. The fundus reconstruction error is about 0.5 mm, and the eyeball estimation error is about 1 mm. Conclusion This fundus reconstruction method can map the position of the endoscope inside the eyeball when a robotic endoscope holder is used in eye surgery. The overall accuracy meets ophthalmologists' requirements.
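As a rough illustration of the pipeline described in the summary, the sketch below back‐projects per‐pixel distance estimates into a world‐frame point cloud using the endoscope pose supplied by the holder's kinematics, and then fits a sphere to that cloud to estimate the eyeball's centre and radius. This is a minimal, assumption‐based sketch, not the authors' implementation: the camera intrinsics K, the pose T_world_cam, and all function names are illustrative.

```python
# Hypothetical sketch of the reconstruction steps described in the abstract.
# Assumptions: 'depth' holds per-pixel distances (mm) predicted by the network,
# K is the endoscope's 3x3 intrinsic matrix, and T_world_cam is the 4x4
# endoscope pose obtained from the robotic holder's forward kinematics.
import numpy as np

def backproject(depth, K, T_world_cam):
    """Convert an HxW distance map into an Nx3 world-frame point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u.ravel(), v.ravel(), np.ones(h * w)])   # homogeneous pixels
    rays = np.linalg.inv(K) @ pix
    rays /= np.linalg.norm(rays, axis=0)                     # unit viewing rays
    pts_cam = rays * depth.ravel()                           # scale by estimated distance
    pts_h = np.vstack([pts_cam, np.ones(h * w)])
    return (T_world_cam @ pts_h)[:3].T                       # world-frame points

def fit_sphere(points):
    """Linear least-squares sphere fit; returns (centre, radius)."""
    # |p - c|^2 = r^2  rearranges to  2 p.c + (r^2 - c.c) = p.p, which is linear.
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = x[:3]
    radius = np.sqrt(x[3] + centre @ centre)
    return centre, radius

if __name__ == "__main__":
    # Toy check: points sampled on a 12 mm radius sphere recover centre and radius.
    rng = np.random.default_rng(0)
    dirs = rng.normal(size=(500, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    pts = np.array([1.0, 2.0, 3.0]) + 12.0 * dirs
    print(fit_sphere(pts))
```

The linear least‐squares sphere fit is one simple way to estimate eyeball size and position from the accumulated fundus points; the paper does not specify the fitting method, so this choice is only a plausible placeholder.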
ISSN: 1478-5951, 1478-596X
DOI: 10.1002/rcs.2496