Practical guide to SHAP analysis: Explaining supervised machine learning model predictions in drug development

Bibliographic Details
Published in: Clinical and Translational Science, Vol. 17, No. 11, e70056 (November 2024)
Main Authors: Ponce‐Bobadilla, Ana Victoria; Schmitt, Vanessa; Maier, Corinna S.; Mensing, Sven; Stodtmann, Sven
Format: Journal Article
Language: English
Published: United States: John Wiley & Sons, Inc., 01.11.2024

Summary: Despite increasing interest in using Artificial Intelligence (AI) and Machine Learning (ML) models for drug development, effectively interpreting their predictions remains a challenge, which limits their impact on clinical decisions. We address this issue by providing a practical guide to SHapley Additive exPlanations (SHAP), a popular feature‐based interpretability method that can be seamlessly integrated into supervised ML models to gain a deeper understanding of their predictions, thereby enhancing their transparency and trustworthiness. This tutorial focuses on the application of SHAP analysis to standard ML black‐box models for regression and classification problems. We provide an overview of various visualization plots and their interpretation, describe available software for implementing SHAP, and highlight best practices as well as special considerations when dealing with binary endpoints and time‐series models. To enhance the reader's understanding of the method, we also apply it to inherently explainable regression models. Finally, we discuss the limitations of the method and ongoing advancements aimed at addressing its current drawbacks.
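
Illustrative example: As a rough sketch of the kind of workflow the tutorial describes (not code from the article), the snippet below applies the open-source shap package to a scikit-learn random-forest regressor on a public dataset and produces a summary (beeswarm) plot. The dataset, model, and settings are illustrative assumptions only.

# Minimal, illustrative SHAP workflow (assumed setup, not from the article):
# a tree-based regression model explained with the `shap` package.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Example regression dataset used as a stand-in for drug-development data.
X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a standard black-box model.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Compute SHAP values with the tree-specific explainer.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)

# Global view of feature contributions across the test set.
shap.summary_plot(shap_values, X_test)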
ISSN: 1752-8054, 1752-8062
DOI: 10.1111/cts.70056