A neural network boosting regression model based on XGBoost


Bibliographic Details
Published in: Applied Soft Computing, Vol. 125, p. 109067
Main Authors: Dong, Jianwei; Chen, Yumin; Yao, Bingyu; Zhang, Xiao; Zeng, Nianfeng
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.08.2022

Summary: The boosting model is a kind of ensemble learning technique that includes XGBoost and GBDT; these methods take decision trees as weak classifiers and achieve strong results in classification and regression problems. Neural networks perform excellently on image and speech recognition, but their weak interpretability limits the development of fusion models. Drawing on the principles and methods of traditional boosting models, we propose a Neural Network Boosting (NNBoost) regression, which takes shallow neural networks with simple structures as weak classifiers. NNBoost is a new ensemble learning method that obtains low regression errors on several data sets. The target loss function of NNBoost is approximated by a Taylor expansion, and by deriving the derivative form of NNBoost we give a gradient descent algorithm. Deep learning architectures are complex and suffer from problems such as vanishing gradients, weak interpretability, and parameters that are difficult to tune. We use an ensemble of simple neural networks to alleviate the vanishing-gradient problem, which is laborious to solve in deep learning, and to curb the overfitting of a learning algorithm. Finally, experiments verify the correctness and effectiveness of NNBoost from multiple angles, demonstrate the effect of fusing multiple shallow neural networks, and widen, to a certain extent, the development path of the boosting idea and deep learning.

• The target loss function of NNBoost is approximated by the Taylor expansion.
• A gradient descent algorithm is put forward from a derivative form of NNBoost.
• The integration of simple neural networks alleviates the gradient vanishing problem.
• The algorithm extends the application scope of ensembling so the model is not limited to decision trees.
• The correctness and effectiveness of NNBoost are verified by experiments from multiple perspectives.
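The core idea described in the summary — boosting in which each weak learner is a shallow neural network fitted to the current residuals — can be sketched as follows. This is only an illustrative sketch under assumptions, not the paper's actual algorithm: the network size, gradient-descent settings, shrinkage, number of rounds, and toy data are all made up for the example; for squared-error loss the negative gradient is simply the residual, which is what each round fits.

```python
# Hedged sketch of the NNBoost idea: gradient boosting whose weak learners
# are shallow (one-hidden-layer) neural networks. All hyperparameters and
# the toy data set below are illustrative assumptions.
import numpy as np

def fit_shallow_net(X, y, hidden=8, steps=500, step_size=0.2, seed=0):
    """Train a one-hidden-layer tanh network on (X, y) by full-batch
    gradient descent on squared error; return a prediction function."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, hidden)); b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=hidden);      b2 = 0.0
    for _ in range(steps):
        H = np.tanh(X @ W1 + b1)             # hidden activations, (n, hidden)
        out = H @ w2 + b2                    # predictions, (n,)
        g = 2.0 * (out - y) / n              # d(MSE)/d(out)
        gH = np.outer(g, w2) * (1.0 - H**2)  # backprop through tanh
        W1 -= step_size * (X.T @ gH); b1 -= step_size * gH.sum(axis=0)
        w2 -= step_size * (H.T @ g);  b2 -= step_size * g.sum()
    return lambda Xq: np.tanh(Xq @ W1 + b1) @ w2 + b2

# Toy regression problem: noisy sine wave.
rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, size=(300, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)

# Boosting loop: for squared-error loss the negative gradient is the
# residual, so each round fits a shallow net to the current residuals.
n_rounds, shrinkage = 10, 0.5
pred = np.full(y.shape, y.mean())            # round 0: constant model
weak_learners = []
for t in range(n_rounds):
    residual = y - pred
    weak = fit_shallow_net(X, residual, seed=t)
    pred += shrinkage * weak(X)
    weak_learners.append(weak)

train_mse = float(np.mean((y - pred) ** 2))  # should beat the constant model
```

As in tree-based boosting, the shrinkage factor damps each weak learner's contribution so later rounds can correct earlier ones; the difference from XGBoost/GBDT is only the choice of weak learner.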
ISSN: 1568-4946, 1872-9681
DOI: 10.1016/j.asoc.2022.109067