A Novel Hyperparameter-free Approach to Decision Tree Construction that Avoids Overfitting by Design
Format | Journal Article |
---|---|
Language | English |
Published | 04.06.2019 |
Summary: | Decision trees are an extremely popular machine learning technique. Unfortunately, overfitting in decision trees remains an open issue that sometimes prevents them from achieving good performance. In this work, we present a novel approach to the construction of decision trees that avoids overfitting by design, without losing accuracy. A distinctive feature of our algorithm is that it requires neither the optimization of any hyperparameters nor the use of regularization techniques, thus significantly reducing decision tree training time. Moreover, our algorithm produces much smaller and shallower trees than traditional algorithms, facilitating the interpretability of the resulting models. |
---|---|
DOI: | 10.48550/arxiv.1906.01246 |
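
The summary above contrasts the proposed method with traditional decision tree learning, where overfitting is usually controlled by tuning hyperparameters such as the maximum depth. The sketch below is illustrative only and does not implement the paper's algorithm; it uses scikit-learn's standard CART learner (`DecisionTreeClassifier`) on a built-in dataset to show the baseline behaviour being referred to: an unconstrained tree that grows deep and overfits, versus a regularized tree whose depth hyperparameter would normally have to be tuned.

```python
# Illustrative baseline only: NOT the paper's hyperparameter-free algorithm.
# Shows the overfitting and tuning burden of conventional decision trees.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Unconstrained tree: typically fits the training set almost perfectly,
# grows deep, and generalizes worse on held-out data.
full_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Conventional remedy: regularize via hyperparameters such as max_depth,
# which must be tuned (e.g. by cross-validation), adding training cost.
pruned_tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

for name, tree in [("unconstrained", full_tree), ("max_depth=3", pruned_tree)]:
    print(
        f"{name}: depth={tree.get_depth()}, leaves={tree.get_n_leaves()}, "
        f"train acc={tree.score(X_train, y_train):.3f}, "
        f"test acc={tree.score(X_test, y_test):.3f}"
    )
```

Comparing tree depth, leaf count, and the gap between training and test accuracy for the two models is one way to quantify the "smaller and shallower trees" and "no hyperparameter optimization" claims when evaluating the paper's method against such baselines.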