Decision tree modeling using R

Bibliographic Details
Published in: Annals of Translational Medicine, Vol. 4, No. 15, p. 275
Main Author: Zhang, Zhongheng
Format: Journal Article
Language: English
Published: China: AME Publishing Company, 01.08.2016

Summary: In the field of machine learning, the decision tree learner is powerful and easy to interpret. It employs a recursive binary partitioning algorithm that splits the sample on the partitioning variable with the strongest association with the response variable, and the process continues until some stopping criterion is met. In the example I focus on the conditional inference tree, which incorporates tree-structured regression models into conditional inference procedures. Because a single grown tree is sensitive to small changes in the training data, the random forest procedure is introduced to address this problem; its sources of diversity are random sampling of the training data and the restricted set of input variables considered at each split. Finally, I introduce R functions to perform model-based recursive partitioning, a method that incorporates recursive partitioning into conventional parametric model building.
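
As a rough illustration of the workflow the summary describes (not the paper's own code), the three methods correspond to the ctree, cforest, and lmtree functions of the partykit package. The data set, formula, and tuning values below are placeholder assumptions chosen only to make the sketch self-contained.

# Minimal R sketch, assuming the partykit package and the built-in
# airquality data; the article's own examples may differ.
library(partykit)

aq <- na.omit(airquality)   # complete cases only, to keep the sketch simple
set.seed(1)                 # reproducible forest

# Conditional inference tree: recursive binary splits chosen by permutation
# tests of association between each partitioning variable and the response.
ct <- ctree(Ozone ~ ., data = aq)
plot(ct)

# Conditional random forest: many trees grown on random subsamples, with a
# random subset of input variables eligible at each split.
cf <- cforest(Ozone ~ ., data = aq, ntree = 100)
head(predict(cf, newdata = aq))

# Model-based recursive partitioning: a linear model of Ozone on Wind is
# refitted within subgroups found by partitioning on the remaining variables.
mb <- lmtree(Ozone ~ Wind | Temp + Solar.R + Month, data = aq)
plot(mb)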
ISSN: 2305-5839
DOI: 10.21037/atm.2016.05.14