Anatomy aware-based 2.5D bronchoscope tracking for image-guided bronchoscopic navigation
| Published in | Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, Vol. 11, No. 4, pp. 1122-1129 |
|---|---|
| Format | Journal Article |
| Language | English; Japanese |
| Published | Taylor & Francis (Informa UK Limited), 04.07.2023 |
| Summary | Physicians use an endoscopic navigation system during bronchoscopy to reduce the risk of getting lost in complex tree-like structures such as the bronchi. Most existing navigation systems rely on a camera pose estimated by bronchoscope tracking and/or deep learning. However, bronchoscope tracking-based methods suffer from tracking errors, and pre-training a deep model requires massive amounts of data. This paper describes an improved bronchoscope tracking procedure that adopts an image domain translation technique to improve tracking performance. Specifically, our scheme consists of three modules: an RGB-D image domain translation module, an anatomical structure classification module, and a structure-aware bronchoscope tracking module. The RGB-D image domain translation module translates a real bronchoscope (RB) image into its corresponding virtual bronchoscope image and depth image. The anatomical structure classification module classifies the current scene into two categories: structureless and rich-structure. The bronchoscope tracking module uses a modified video-CT bronchoscope tracking approach to estimate the camera pose. Experimental results showed that the proposed method achieved higher tracking accuracy than current state-of-the-art bronchoscope tracking methods. |
|---|---|
| ISSN | 2168-1163; 2168-1171 |
| DOI | 10.1080/21681163.2022.2152728 |
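The summary above describes a three-stage pipeline: RGB-D domain translation, anatomical scene classification, and structure-aware video-CT tracking. The sketch below illustrates how such a per-frame pipeline could be wired together; it is a minimal illustration under our own assumptions, and every class, method, and heuristic in it is a hypothetical placeholder rather than the authors' implementation.

```python
# Illustrative sketch of the three-module tracking pipeline described in the
# summary. All names and heuristics are hypothetical placeholders.
import numpy as np


class DomainTranslator:
    """Placeholder for the network that maps a real bronchoscope (RB) frame
    to a virtual bronchoscope (VB) image plus a depth map."""

    def translate(self, rb_frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
        vb_image = rb_frame.astype(np.float32)                  # stand-in: identity mapping
        depth = np.ones(rb_frame.shape[:2], dtype=np.float32)   # stand-in: flat depth map
        return vb_image, depth


class SceneClassifier:
    """Placeholder for the anatomical structure classifier that labels a
    scene as 'structureless' or 'rich-structure'."""

    def classify(self, depth: np.ndarray) -> str:
        # Stand-in heuristic: large depth variation suggests visible
        # bifurcations, i.e. a structure-rich view.
        return "rich-structure" if depth.std() > 0.1 else "structureless"


class VideoCTTracker:
    """Placeholder for the modified video-CT registration step that refines
    the camera pose against the pre-operative CT airway model."""

    def register(self, vb_image: np.ndarray, depth: np.ndarray,
                 prev_pose: np.ndarray) -> np.ndarray:
        return prev_pose  # stand-in: keep the previous 4x4 pose matrix


def track_frame(rb_frame: np.ndarray, prev_pose: np.ndarray,
                translator: DomainTranslator, classifier: SceneClassifier,
                tracker: VideoCTTracker) -> np.ndarray:
    """Process one RB frame and return an updated camera pose estimate."""
    vb_image, depth = translator.translate(rb_frame)
    if classifier.classify(depth) == "rich-structure":
        # Enough anatomical structure is visible to run full video-CT
        # registration on this frame.
        return tracker.register(vb_image, depth, prev_pose)
    # Structureless view: keep the previous pose (one plausible fallback;
    # the summary does not specify how such frames are handled).
    return prev_pose
```

A caller would loop `track_frame` over the bronchoscopic video stream, seeding `prev_pose` with an initial registration of the first frame; the gating on scene class is what makes the tracking "anatomy aware" in this sketch.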