Development of a colon endoscope robot that adjusts its locomotion through the use of reinforcement learning

Bibliographic Details
Published in: International Journal of Computer Assisted Radiology and Surgery, Vol. 5, No. 4, pp. 317-325
Main Authors: Trovato, G., Shikanai, M., Ukawa, G., Kinoshita, J., Murai, N., Lee, J. W., Ishii, H., Takanishi, A., Tanoue, K., Ieiri, S., Konishi, K., Hashizume, M.
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer-Verlag, 01.07.2010

Summary: Purpose: Fibre optic colonoscopy is usually performed with manual introduction and advancement of the endoscope, but there is potential for a robot capable of locomoting autonomously from the rectum to the caecum. A prototype robot was designed and tested. Methods: The robotic colonic endoscope consists of a front body with a clockwise helical fin and a rear body with an anticlockwise one, connected via a DC motor. The robot adjusts its input voltage automatically through reinforcement learning, which determines its speed and direction (forward or backward). Results: Experiments were performed both in vitro and in vivo, demonstrating the feasibility of the robot. The device is capable of moving in a slippery environment, and reinforcement learning algorithms such as Q-learning and SARSA obtain better results than simply applying full voltage to the robot. Conclusions: This self-propelled robotic endoscope has potential as an alternative to current fibre optic colonoscopy examination methods, especially with the addition of new sensors under development.
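The record names Q-learning and SARSA but gives no implementation detail, so the following is only a minimal Q-learning sketch of voltage selection, assuming a discretized progress state and a small set of candidate motor voltages; all names, values, and the toy environment are illustrative assumptions, not the authors' design.

import random

# Minimal Q-learning sketch for choosing a motor voltage. The state space,
# action set, reward signal, and dynamics below are illustrative assumptions;
# the record does not specify them.

VOLTAGES = [-6.0, -3.0, 0.0, 3.0, 6.0]  # assumed action set: motor voltages (V);
                                        # the sign encodes backward vs. forward
N_STATES = 5                            # assumed discretized progress levels

ALPHA = 0.1    # learning rate
GAMMA = 0.9    # discount factor
EPSILON = 0.1  # exploration rate

Q = {(s, a): 0.0 for s in range(N_STATES) for a in range(len(VOLTAGES))}

def choose_action(state):
    """Epsilon-greedy selection over the discrete voltage set."""
    if random.random() < EPSILON:
        return random.randrange(len(VOLTAGES))
    return max(range(len(VOLTAGES)), key=lambda a: Q[(state, a)])

def q_update(state, action, reward, next_state):
    """Q-learning update: off-policy, bootstraps on the greedy next action.
    SARSA would instead bootstrap on the action actually taken next."""
    best_next = max(Q[(next_state, a)] for a in range(len(VOLTAGES)))
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

def step(state, voltage):
    """Stand-in for the robot-in-colon dynamics: returns (reward, next_state).
    A real controller would derive the reward from measured forward progress."""
    progress = voltage * random.uniform(0.0, 0.3)  # toy slippery dynamics
    next_state = min(max(state + (1 if progress > 0 else -1), 0), N_STATES - 1)
    return progress, next_state

state = N_STATES // 2
for _ in range(1000):
    action = choose_action(state)
    reward, next_state = step(state, VOLTAGES[action])
    q_update(state, action, reward, next_state)
    state = next_state

The abstract's reported result is only that such learned voltage schedules outperform a constant full-voltage input; SARSA differs from this sketch solely in its update target.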
ISSN: 1861-6410, 1861-6429
DOI: 10.1007/s11548-010-0481-0