Design and implementation of a new cable‐driven robot for MRI‐guided breast biopsy

Bibliographic Details
Published in: The international journal of medical robotics + computer assisted surgery, Vol. 16, No. 2, pp. e2063 - n/a
Main Authors: Liu, Wenxuan; Yang, Zhiyong; Jiang, Shan; Feng, Di; Zhang, Daguang
Format: Journal Article
Language: English
Published: Chichester, UK: John Wiley & Sons, Inc., 01.04.2020
Wiley Subscription Services, Inc

Summary: Background Breast cancer is one of the most common cancers diagnosed among US women. Early and accurate diagnosis using breast biopsy techniques is essential in detecting cancer. Methods In this paper, we present a new cable‐driven robot for MRI‐guided breast biopsy. A compact three degree‐of‐freedom (DOF) semi‐automated robot driven by ultrasonic motors is designed with non‐magnetic materials. Next, a novel insertion trajectory planning algorithm, based on a breast holder that we designed, is proposed; it helps radiologists locate the lesion and calculate the insertion trajectory. To improve insertion accuracy, kinematic analysis and accuracy compensation methods are introduced. Results An experimental study based on image recognition and positioning is performed to validate the performance of the new robot. The results show a mean position accuracy of 0.7 ± 0.04 mm. Conclusions Application of the new robot can improve breast biopsy accuracy and reduce surgery time.
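The accuracy figure reported in the abstract (mean position accuracy of 0.7 ± 0.04 mm) is typically obtained by comparing planned target coordinates against measured needle-tip positions. The sketch below illustrates that computation in Python with entirely hypothetical coordinates; the function names and the data are illustrative assumptions, not the authors' actual evaluation code or results.

```python
import math

def position_errors(targets, measured):
    """Euclidean distance (mm) between each planned target point and
    the corresponding measured needle-tip position."""
    return [math.dist(t, m) for t, m in zip(targets, measured)]

def mean_std(values):
    """Mean and population standard deviation of a list of values."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return mean, math.sqrt(var)

# Hypothetical target vs. measured tip coordinates in mm (illustrative only).
targets  = [(10.0, 20.0, 5.0), (15.0, 18.0, 7.0), (12.0, 25.0, 6.0)]
measured = [(10.5, 20.2, 5.1), (15.3, 17.6, 7.2), (12.4, 25.5, 5.8)]

errors = position_errors(targets, measured)
mu, sigma = mean_std(errors)
print(f"mean position error: {mu:.2f} ± {sigma:.2f} mm")
```

With real data, `targets` would come from the planned insertion trajectories and `measured` from image recognition of the needle tip in the MR images.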
Bibliography: Funding information
National Natural Science Foundation of China, Grant/Award Numbers: 51775368, 5171101938; Science and Technology Planning Project of Guangdong Province, China, Grant/Award Number: 2017B020210004
ISSN: 1478-5951, 1478-596X
DOI: 10.1002/rcs.2063