From Diagnosis to Precision Surgery: the Transformative Role of Artificial Intelligence in Urologic Imaging

Bibliographic Details
Published in: Journal of Endourology
Main Authors: Khizir, Labeeqa; Bhandari, Vineet; Kaloth, Srivarsha; Pfail, John; Lichtbroun, Benjamin; Yanamala, Naveena; Elsamra, Sammy
Format: Journal Article
Language: English
Published: United States, 01.08.2024

Summary: The multidisciplinary nature of artificial intelligence (AI) has allowed for rapid growth of its application in medical imaging. AI algorithms can augment various imaging modalities such as X-rays, CT, and MRI to improve image quality and generate high-resolution three-dimensional images. AI reconstruction of three-dimensional models of patient anatomy from CT or MRI scans can better enable urologists to visualize structures and accurately plan surgical approaches. AI can also be optimized to create virtual reality simulations of surgical procedures based on patient-specific data, giving urologists more hands-on experience and preparation. Recently developed AI-based platforms such as TeraRecon and Ceevra offer rapid and efficient medical imaging analyses aimed at enhancing the provision of urologic care, notably for intra-operative guidance during robotic-assisted radical prostatectomy and partial nephrectomy. In particular, the use of 3-D VR models has been linked to reduced operative times, shorter hospital stays, reduced clamp time, and lower blood loss in patients undergoing robotic-assisted laparoscopic partial nephrectomy compared with standard operative approaches that do not use VR technology.
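
Illustrative sketch (Python, not drawn from the article): the three-dimensional reconstruction step described in the summary can be pictured as extracting a surface mesh from a segmented volumetric scan. The synthetic volume, threshold, and voxel spacing below are assumptions for demonstration only; a real pipeline would start from a segmentation mask produced by an AI model on CT or MRI data.

    # Minimal sketch: reconstruct a 3-D surface mesh from a volumetric scan.
    # A synthetic sphere stands in for a segmented organ (e.g., a kidney);
    # in practice the binary mask would come from an AI segmentation model.
    import numpy as np
    from skimage import measure

    # Toy 64x64x64 volume: voxels within 20 units of the centre are "organ".
    grid = np.indices((64, 64, 64)).astype(float)
    center = np.array([32.0, 32.0, 32.0]).reshape(3, 1, 1, 1)
    distance = np.sqrt(((grid - center) ** 2).sum(axis=0))
    volume = (distance < 20).astype(float)  # binary segmentation mask

    # Marching cubes extracts a triangle mesh at the 0.5 iso-surface;
    # `spacing` would normally be the scan's voxel dimensions in millimetres.
    verts, faces, normals, _ = measure.marching_cubes(
        volume, level=0.5, spacing=(1.0, 1.0, 1.0)
    )

    print(f"Reconstructed mesh: {len(verts)} vertices, {len(faces)} faces")
    # The mesh could then be exported (e.g., to STL/OBJ) for surgical planning
    # or loaded into a VR viewer for patient-specific rehearsal.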
ISSN: 1557-900X
DOI: 10.1089/end.2023.0695