A Novel Hyperparameter-Free Approach to Decision Tree Construction That Avoids Overfitting by Design

Bibliographic Details
Published in: IEEE Access, Vol. 7, pp. 99978-99987
Main Authors: Garcia Leiva, Rafael; Fernandez Anta, Antonio; Mancuso, Vincenzo; Casari, Paolo
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2019

Summary: Decision trees are an extremely popular machine learning technique. Unfortunately, overfitting in decision trees remains an open issue that sometimes prevents achieving good performance. In this paper, we present a novel approach to the construction of decision trees that avoids overfitting by design, without losing accuracy. A distinctive feature of our algorithm is that it requires neither the optimization of any hyperparameters nor the use of regularization techniques, thus significantly reducing the decision tree training time. Moreover, our algorithm produces much smaller and shallower trees than traditional algorithms, facilitating the interpretability of the resulting models. For reproducibility, we provide an open source version of the algorithm.
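The abstract contrasts the proposed algorithm with traditional decision tree learners in terms of tree size and depth. The sketch below is not the authors' hyperparameter-free algorithm; it only illustrates, using scikit-learn's conventional CART implementation and an assumed example dataset (breast cancer), the quantities a reader would compare: test accuracy, node count (tree size), and depth (tree shallowness).

```python
# Baseline illustration only: a conventional CART tree with default
# hyperparameters, used to show the size/depth metrics the abstract refers to.
# This is NOT the paper's hyperparameter-free construction method.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unconstrained CART tree tends to overfit and grow deep unless its
# hyperparameters (max_depth, min_samples_leaf, ...) are tuned or pruning is applied.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

print("test accuracy:", tree.score(X_test, y_test))
print("node count:   ", tree.tree_.node_count)  # tree size
print("depth:        ", tree.get_depth())       # tree shallowness
```

A hyperparameter-free method such as the one described in the paper would be evaluated by reporting the same three numbers without any tuning step.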
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2930235