Knowledge based decision tree construction with feature importance domain knowledge
Published in | 2012 7th International Conference on Electrical and Computer Engineering, pp. 659 - 662
Format | Conference Proceeding
Language | English
Published | IEEE, 01.12.2012
Summary | Decision tree is a widely used supervised learning algorithm due to its many advantages, such as fast non-parametric learning and comprehensibility. However, decision trees require a large training set to learn accurately, because decision tree algorithms recursively partition the data set, leaving very few instances at the lower levels of the tree. To address this drawback, we present a novel algorithm named Importance Aided Decision Tree (IADT), which takes feature importance as additional domain knowledge. Additional domain knowledge has been shown to enhance the performance of learners. A decision tree algorithm always seeks the most important attribute at each node, so feature importance is a relevant form of domain knowledge for decision tree learning. Our algorithm uses a novel approach to incorporate this feature importance score into decision tree learning, which makes decision trees more accurate and robust. We present theoretical and empirical performance analyses to show that IADT is superior to standard decision tree learning algorithms.
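The record does not describe IADT's actual procedure, so the sketch below is only an assumed illustration of the general idea in the summary: biasing a decision tree's split selection toward features that an expert-supplied importance score marks as relevant. The function names, the toy data, and the simple multiplicative weighting of information gain are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch: weight a tree's split criterion by an externally
# supplied feature-importance score. This is NOT the authors' IADT
# algorithm, only an illustration of using importance as domain knowledge.
import numpy as np
from collections import Counter

def entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    probs = counts / counts.sum()
    return -np.sum(probs * np.log2(probs))

def information_gain(column, y, threshold):
    left, right = y[column <= threshold], y[column > threshold]
    if len(left) == 0 or len(right) == 0:
        return 0.0
    n = len(y)
    return entropy(y) - (len(left) / n) * entropy(left) - (len(right) / n) * entropy(right)

def best_split(X, y, importance):
    """Choose the (feature, threshold) with the highest importance-weighted gain.

    importance[j] is an assumed domain-knowledge score in [0, 1] for feature j.
    """
    best = (None, None, -1.0)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            weighted_gain = information_gain(X[:, j], y, t) * importance[j]
            if weighted_gain > best[2]:
                best = (j, t, weighted_gain)
    return best

if __name__ == "__main__":
    # Toy data: feature 0 is rated far more important than feature 1,
    # so ties and near-ties in raw gain resolve in its favor.
    X = np.array([[2.0, 1.0], [3.0, 1.5], [1.0, 3.0], [0.5, 2.5]])
    y = np.array([0, 0, 1, 1])
    importance = np.array([0.9, 0.2])
    print(best_split(X, y, importance))
```

Applied recursively at each node, such a weighting keeps expert-favored attributes near the top of the tree, which is one plausible way the kind of domain knowledge described in the summary could be injected into tree induction.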
ISBN | 146731434X, 9781467314343
DOI | 10.1109/ICECE.2012.6471636