Preclinical evaluation of a markerless, real-time, augmented reality guidance system for robot-assisted radical prostatectomy
| Published in | International Journal for Computer Assisted Radiology and Surgery, Vol. 16, No. 7, pp. 1181–1188 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | Cham: Springer International Publishing, 01.07.2021 (Springer Nature B.V.) |
Summary:

Purpose
Intra-operative augmented reality (AR) during surgery can mitigate incomplete cancer removal by overlaying the anatomical boundaries extracted from medical imaging data onto the camera image. In this paper, we present the first such completely markerless AR guidance system for robot-assisted laparoscopic radical prostatectomy (RALRP) that transforms medical data from transrectal ultrasound (TRUS) to endoscope camera image. Moreover, we reduce the total number of transformations by combining the hand–eye and camera calibrations in a single step.
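The overlay described above amounts to chaining two transforms: a rigid TRUS-to-robot transform followed by the camera projection. As a minimal illustrative sketch (the function name `trus_to_pixel` and the example matrices are assumptions for illustration, not values from the paper):

```python
import numpy as np

def trus_to_pixel(p_trus, T_dv_trus, M):
    """Map a 3D point in the TRUS frame to image pixel coordinates:
    first the rigid TRUS-to-robot transform, then the camera projection."""
    p_h = np.append(p_trus, 1.0)   # homogeneous coordinates
    p_dv = T_dv_trus @ p_h         # TRUS frame -> robot (da Vinci) frame
    u = M @ p_dv                   # project into the image plane
    return u[:2] / u[2]            # perspective divide -> pixels

# Hypothetical example: identity TRUS-to-robot transform and a simple
# pinhole projection (f = 1000 px, principal point at (320, 240)).
T_dv_trus = np.eye(4)
M = np.array([[1000.0, 0.0, 320.0, 0.0],
              [0.0, 1000.0, 240.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
print(trus_to_pixel(np.array([0.0, 0.0, 2.0]), T_dv_trus, M))  # [320. 240.]
```

A point on the optical axis lands at the principal point, as expected for a pinhole model.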
Methods
Our proposed solution requires two transformations: the TRUS-to-robot transform, ^DV T_TRUS, and the camera projection matrix, M (i.e., the transformation from the endoscope frame to the camera image frame). ^DV T_TRUS is estimated by the method proposed in Mohareri et al. (J Urol 193(1):302–312, 2015). M is estimated by selecting corresponding 3D–2D data points in the endoscope and image coordinate frames, respectively, using a CAD model of the surgical instrument and a preoperative camera intrinsic matrix, under the assumption of a projective camera. The parameters are estimated using the Levenberg–Marquardt algorithm. Overall mean re-projection errors (MRE) are reported on simulated data and on real data from a water bath. We show that M can be re-estimated if the focus is changed during surgery.
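This estimation step can be sketched as a nonlinear least-squares refinement of a 3x4 projection matrix from 3D–2D correspondences. The sketch below is not the authors' implementation; the function names and the use of SciPy's `least_squares` with its Levenberg–Marquardt backend are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def project(M, X_h):
    """Project homogeneous 3D points (N, 4) through a 3x4 matrix M to pixels (N, 2)."""
    x = X_h @ M.T
    return x[:, :2] / x[:, 2:3]

def reprojection_residuals(m, X_h, uv):
    """Stacked pixel residuals between projected and measured 2D points."""
    return (project(m.reshape(3, 4), X_h) - uv).ravel()

def estimate_projection_matrix(X3d, uv, M0):
    """Refine a 3x4 projection matrix from 3D-2D correspondences by
    minimizing re-projection error with Levenberg-Marquardt."""
    X_h = np.hstack([X3d, np.ones((len(X3d), 1))])
    sol = least_squares(reprojection_residuals, M0.ravel(),
                        args=(X_h, uv), method="lm")
    return sol.x.reshape(3, 4)
```

Given a reasonable initial guess (e.g., from the preoperative intrinsics), the refinement drives the mean re-projection error on the calibration correspondences toward zero; the same routine could, in principle, be re-run if the focus changes.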
Results
Using simulated data, we obtained an overall MRE in the range of 11.69–13.32 pixels for the monoscopic and stereo left and right cameras. For the water bath experiment, the overall MRE is in the range of 26.04–30.59 pixels for the monoscopic and stereo cameras. The overall system error from TRUS to the camera world frame is 4.05 mm. Details of the procedure are given in the supplementary material.
Conclusion
We demonstrate a markerless AR guidance system for RALRP that needs no calibration markers and can therefore re-estimate the camera projection matrix if it changes during surgery, e.g., due to a focus change.
| Bibliography | ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2 |
|---|---|
| ISSN | 1861-6410; 1861-6429 |
| DOI | 10.1007/s11548-021-02419-9 |