Interpretability of SurvivalBoost upon Shapley Additive Explanation value on medical data


Bibliographic Details
Published in: Communications in Statistics - Simulation and Computation, Vol. 53, No. 7, pp. 3058-3067
Main Authors: Wang, Yating; Su, Jinxia; Zhao, Xuejing
Format: Journal Article
Language: English
Published: Philadelphia: Taylor & Francis, 02.07.2024
ISSN: 0361-0918, 1532-4141
DOI: 10.1080/03610918.2022.2094962

Summary: Machine learning methods have been extensively used in survival analysis. SurvivalBoost, a machine-learning-based survival regression algorithm that combines an Elastic-net-type penalized semiparametric Cox regression model with XGBoost and random survival forests, has been shown to achieve superior prediction performance on real and simulated datasets. However, its interpretability remains unexplored. This paper examines the interpretability of the algorithm using Shapley Additive Explanation (SHAP) values, and illustrates that the algorithm can thereby more effectively guide diagnosis and the practice of survival analysis.
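To make the SHAP idea in the abstract concrete, the sketch below computes exact Shapley values for a toy, hypothetical linear "risk score" that stands in for a fitted survival model's prediction; it does not reimplement SurvivalBoost or the `shap` library, and the coefficients and baseline are illustrative assumptions only. Features outside a coalition are replaced by their baseline values, which is one common way of defining the value function for model explanations.

```python
from itertools import combinations
from math import factorial

# Hypothetical linear risk score standing in for a fitted survival
# model's prediction (coefficients are illustrative assumptions).
def risk_score(x):
    return 2.0 * x[0] - 1.0 * x[1] + 0.5 * x[2]

def shapley_values(f, x, baseline):
    """Exact Shapley values for f at x.

    Features outside the coalition S are set to their baseline values;
    phi[i] is the weighted average marginal contribution of feature i
    over all coalitions of the remaining features.
    """
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for S in combinations(others, size):
                # Shapley kernel weight |S|! (n - |S| - 1)! / n!
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                x_with = [x[j] if (j in S or j == i) else baseline[j]
                          for j in range(n)]
                x_without = [x[j] if j in S else baseline[j]
                             for j in range(n)]
                phi[i] += w * (f(x_with) - f(x_without))
    return phi

x = [1.0, 2.0, 3.0]
baseline = [0.0, 0.0, 0.0]
phi = shapley_values(risk_score, x, baseline)
```

For a linear model this recovers `phi[i] == coef[i] * (x[i] - baseline[i])`, and the efficiency property holds: the attributions sum to `risk_score(x) - risk_score(baseline)`. In practice, exact enumeration is exponential in the number of features, which is why tree-model explainers such as TreeSHAP use model structure to compute the same quantities efficiently.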