Interpretability of SurvivalBoost upon Shapley Additive Explanation value on medical data
| Field | Value |
|---|---|
| Published in | Communications in Statistics - Simulation and Computation, Vol. 53, No. 7, pp. 3058-3067 |
| Main Authors | , , |
| Format | Journal Article |
| Language | English |
| Published | Philadelphia: Taylor & Francis, 02.07.2024 |
| ISSN | 0361-0918; 1532-4141 |
| DOI | 10.1080/03610918.2022.2094962 |
Summary: Machine learning methods have been used extensively in survival analysis. SurvivalBoost, a machine-learning-based survival regression algorithm that combines an Elastic-net-type penalized semiparametric Cox regression model with XGBoost and random survival forests, has had its superior prediction performance verified on real and simulated datasets. Its interpretability, however, remains unexplored. This paper examines the interpretability of the algorithm using Shapley Additive Explanation (SHAP) values, and illustrates that the algorithm can more effectively guide diagnosis and the practice of survival analysis.
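The SHAP values the summary refers to decompose a model's prediction into per-feature contributions that sum to the difference between the prediction and a baseline prediction. As a minimal stdlib-only sketch (not the paper's implementation; the toy risk score, covariates, and baseline below are all hypothetical), exact Shapley values can be computed by averaging each feature's marginal contribution over all subsets of the remaining features:

```python
from itertools import combinations
from math import factorial

# Hypothetical risk score standing in for a fitted survival model's output.
# `present` is the set of feature indices taken from x; the rest fall back
# to the baseline values. The coefficients are made up for illustration.
def risk(present, x, baseline):
    v = [x[i] if i in present else baseline[i] for i in range(len(x))]
    return 2.0 * v[0] + 0.5 * v[1] - 1.5 * v[2] + 0.8 * v[0] * v[2]

def shapley_values(f, x, baseline):
    """Exact Shapley values by enumerating all feature subsets (O(2^n))."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for subset in combinations(others, k):
                S = set(subset)
                # Shapley kernel weight for a coalition of size |S|.
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                phi[i] += w * (f(S | {i}, x, baseline) - f(S, x, baseline))
    return phi

x = [1.2, 3.0, 0.5]        # one patient's covariates (hypothetical)
base = [0.0, 0.0, 0.0]     # reference/baseline covariates
phi = shapley_values(risk, x, base)

# Additivity (the "Additive" in SHAP): contributions sum exactly to
# f(x) - f(baseline).
total = risk(set(range(3)), x, base) - risk(set(), x, base)
assert abs(sum(phi) - total) < 1e-9
```

This brute-force enumeration is exponential in the number of features; for tree ensembles such as XGBoost, the `shap` library's TreeExplainer computes the same values in polynomial time, which is what makes SHAP practical for interpreting boosted survival models.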