Is ChatGPT knowledgeable of acute coronary syndromes and pertinent European Society of Cardiology Guidelines?

Bibliographic Details
Published in: Minerva Cardiology and Angiology, Vol. 72, No. 3, p. 299
Main Authors: Gurbuz, Dogac C; Varis, Eser
Format: Journal Article
Language: English
Published: Italy, 01.06.2024

Summary: Advancements in artificial intelligence are being seen in multiple fields, including medicine, and this trend is likely to continue. The aim of this study was to analyze the accuracy and reproducibility of ChatGPT answers about acute coronary syndromes (ACS). The questions posed to ChatGPT were prepared in two categories: a list of frequently asked questions (FAQs) compiled from inquiries made by the public, and a scientific question list based on the 2023 European Society of Cardiology (ESC) Guidelines for the management of ACS and the ESC Clinical Practice Guidelines. The accuracy and reproducibility of ChatGPT responses about ACS were evaluated by two cardiologists with ten years of experience using the Global Quality Score (GQS). In total, 72 FAQs related to ACS met the study inclusion criteria. Of these, 65 (90.3%) ChatGPT answers scored GQS 5, indicating the highest accuracy and proficiency, and none of the ChatGPT responses to FAQs about ACS scored GQS 1. The highest accuracy and reliability of ChatGPT answers were obtained for the prevention and lifestyle section, with GQS 5 for 19 (95%) answers and GQS 4 for 1 (5%) answer. In contrast, the accuracy and proficiency of ChatGPT answers were lowest for the treatment and management section. Moreover, 68 (88.3%) ChatGPT responses to guideline-based questions scored GQS 5. The reproducibility of ChatGPT answers was 94.4% for FAQs and 90.9% for ESC guideline questions. This study shows for the first time that ChatGPT can give accurate and sufficient responses to more than 90% of FAQs about ACS. In addition, the proficiency and correctness of ChatGPT answers to questions based on the ESC guidelines were also substantial.
ISSN: 2724-5772
DOI: 10.23736/S2724-5683.24.06517-7