Decision-Tree-Initialized Dendritic Neuron Model for Fast and Accurate Data Classification


Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 33, no. 9, pp. 4173-4183
Main Authors: Luo, Xudong; Wen, Xiaohao; Zhou, MengChu; Abusorrah, Abdullah; Huang, Lukui
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.09.2022

Summary: This work proposes a decision tree (DT)-based method for initializing a dendritic neuron model (DNM). Neural networks continue to grow in size and thus consume ever more computing resources, which creates a strong need to prune neurons that contribute little to a network's output. However, pruning low-contribution neurons may reduce a DNM's accuracy. The proposed method is novel because 1) it can reduce the number of dendrites in a DNM and improve training efficiency without sacrificing accuracy, and 2) it can select proper initial weights and thresholds for neurons. After initialization with the proposed DT-based method, the DNM is trained with the Adam algorithm. To verify its effectiveness, the method is applied to seven benchmark datasets. The results show that the decision-tree-initialized DNM is significantly better than the original DNM, k-nearest neighbor, support vector machine, back-propagation neural network, and DT classification methods. It exhibits the lowest model complexity and the highest training speed without losing any accuracy. The interactions among attributes can also be observed in its dendritic neurons.
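For context, the DNM referenced in the summary is commonly formulated as four layers: a sigmoidal synaptic layer, a multiplicative dendrite layer, a summing membrane layer, and a sigmoidal soma. The sketch below illustrates that standard forward pass only; the parameter names, sizes, and the random initialization are illustrative assumptions and do not reproduce the paper's DT-based initialization or its Adam training.

```python
import numpy as np

def dnm_forward(x, w, theta, k=5.0, theta_soma=0.5):
    """Forward pass of a standard dendritic neuron model (sketch).

    x: input sample, shape (n_inputs,)
    w, theta: synaptic weights and thresholds, shape (n_dendrites, n_inputs)
    k, theta_soma: illustrative steepness and soma threshold (assumed values)
    """
    # Synaptic layer: sigmoid connection of every input to every dendrite.
    y = 1.0 / (1.0 + np.exp(-k * (w * x - theta)))
    # Dendrite layer: multiplicative interaction of the synapses on a branch
    # (this product is where attribute interactions become visible).
    z = np.prod(y, axis=1)
    # Membrane layer: sum the dendritic branch outputs.
    v = z.sum()
    # Soma: final sigmoid producing a class score in (0, 1).
    return 1.0 / (1.0 + np.exp(-k * (v - theta_soma)))

rng = np.random.default_rng(0)
x = rng.random(4)                # one 4-feature sample (hypothetical)
w = rng.normal(size=(3, 4))      # 3 dendrites, randomly initialized here;
theta = rng.normal(size=(3, 4))  # the paper instead derives these from a DT
score = dnm_forward(x, w, theta)
```

Pruning a dendrite in this formulation simply removes one row of `w` and `theta`, which is why an initialization that starts with few, well-placed dendrites can cut model complexity without an accuracy loss.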
ISSN: 2162-237X
EISSN: 2162-2388
DOI: 10.1109/TNNLS.2021.3055991