Progressively Balanced Multi-class Neural Trees

Bibliographic Details
Published in: 2018 Twenty Fourth National Conference on Communications (NCC), pp. 1 - 6
Main Authors: Godbole, Ameya; Bhat, Spoorthy; Guha, Prithwijit
Format: Conference Proceeding
Language: English
Published: IEEE, 01.02.2018

Summary: Decision trees are discriminative classifiers that hierarchically partition the input space into regions containing instances with uniform class labels. Existing works in this area have mostly focused on C4.5 trees, which learn axis-aligned partitions. Neural trees, in contrast, learn oblique partitions from data and require fewer decision nodes, each hosting a perceptron. However, these perceptrons are susceptible to data imbalance. This motivated us to propose a progressively balanced neural tree in which the training data are balanced prior to perceptron learning. The second contribution is the optimization of the decision function with respect to entropy impurity based objective functions; this formulation also allows a parent node to have more than two child nodes. The proposed algorithm is benchmarked on ten standard datasets against three baseline multi-class classification algorithms.
DOI:10.1109/NCC.2018.8599945
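
Illustration (not part of the record): the summary describes two components that can be sketched concretely, class balancing of a node's training data before its perceptron is learned, and an entropy-impurity objective used to score a split that may have more than two children. The Python sketch below is only an assumption of how such steps could look, not the authors' implementation; the helper names balance_by_oversampling and split_entropy, and the choice of random oversampling, are hypothetical.

```python
import numpy as np

def balance_by_oversampling(X, y, rng=None):
    """Hypothetical balancing step: oversample minority classes so that
    every class at a node has as many instances as the largest class."""
    rng = np.random.default_rng(0) if rng is None else rng
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    idx = []
    for c in classes:
        members = np.flatnonzero(y == c)
        # sample with replacement up to the majority-class count
        idx.append(rng.choice(members, size=target, replace=True))
    idx = np.concatenate(idx)
    return X[idx], y[idx]

def split_entropy(y, child_assignment):
    """Weighted entropy impurity of a (possibly more-than-two-way) split:
    sum over children of (n_child / n) * H(labels in child)."""
    n = len(y)
    impurity = 0.0
    for child in np.unique(child_assignment):
        labels = y[child_assignment == child]
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        impurity += (len(labels) / n) * -np.sum(p * np.log2(p))
    return impurity

# Toy usage: an imbalanced two-feature dataset with three classes.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1], [2.0, 2.0]])
y = np.array([0, 0, 1, 1, 2])
Xb, yb = balance_by_oversampling(X, y)                 # each class now has 2 samples
print(split_entropy(y, (X[:, 0] > 0.5).astype(int)))   # impurity of a 2-way split
```

In a neural tree of the kind the summary describes, the perceptron at each decision node could be fit on the balanced sample, and candidate decision functions compared by their weighted split entropy, with lower impurity preferred.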