ChatGPT and most frequent urological diseases: analysing the quality of information and potential risks for patients
Published in | World Journal of Urology, Vol. 41, No. 11, pp. 3149–3153
Main Authors |
Format | Journal Article
Language | English
Published | Berlin/Heidelberg: Springer Berlin Heidelberg, 01.11.2023 (Springer Nature B.V.)
Summary:
Purpose
Artificial intelligence (AI) refers to systems or combinations of algorithms that mimic human intelligence. ChatGPT is AI-based software recently developed by OpenAI. One of its potential uses is to consult information about pathologies and their treatments. Our objective was to assess the quality of the information provided by AI tools such as ChatGPT and to establish whether it is a reliable source of information for patients.
Methods
Questions about bladder cancer, prostate cancer, renal cancer, benign prostatic hypertrophy (BPH), and urinary stones were submitted to ChatGPT 4.0. Two urologists analysed the responses provided by ChatGPT using the DISCERN questionnaire and a brief instrument for evaluating the quality of informed consent documents.
Results
The overall information provided for all pathologies was well balanced. For each pathology, the responses explained the anatomical location, the affected population, and the symptoms, and concluded with the established risk factors and possible treatments. All treatment answers had a moderate quality score with DISCERN (3 of 5 points). The answers about surgical options included the recovery time, type of anaesthesia, and potential complications. After analysing all the responses related to each disease, all pathologies except BPH achieved a DISCERN score of 4.
Conclusions
ChatGPT information should be used with caution, since the chatbot does not disclose its sources of information and its answers may contain bias even for simple questions about the basics of urological diseases.
ISSN: | 0724-4983; 1433-8726
DOI: | 10.1007/s00345-023-04563-0 |