Pareto parameter estimation by merging locally weighted median of multiple neural networks and weighted least squares


Bibliographic Details
Published in: Alexandria Engineering Journal, Vol. 87, pp. 524-532
Main Authors: Aydi, Walid; Alatiyyah, Mohammed
Format: Journal Article
Language: English
Published: Elsevier, 01.01.2024

Summary: The Pareto distribution plays an important role in many data analysis tasks, and a key aspect of working with this distribution is the estimation of its parameters. Several studies use classical methods, Bayes estimation, and neural networks (NNs) to estimate Pareto parameters; others have attempted to combine classical methods with a single NN-based model. However, in the parameter estimation field there is little research on how sensitive a single NN, whose training algorithm is stochastic, is to the specifics of its training data. The current research aims to show that aggregating multiple weighted NN models, combined with a weighted ordinary least-squares regression algorithm, overcomes the dependence on the specifics of the training data and the sensitivity to outliers, respectively. The proposed method enables a locally less accurate model to participate to a lesser extent in the overall aggregation. The method was compared with prevalent methods in the area (ordinary least squares, weighted ordinary least squares, maximum likelihood estimation, and Bayes estimation) using Monte Carlo simulations. The results verified the superiority of the proposed method in terms of regression error metrics. Moreover, it can be adapted to a variety of distributions.
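The two building blocks named in the summary, the maximum likelihood baseline for the Pareto shape parameter and a weighted median used to aggregate an ensemble of estimators, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the ensemble here is simulated by perturbed MLE estimates rather than trained NNs, and the local-accuracy weighting scheme (`1 / (0.05 + deviation)`) is a hypothetical stand-in for the paper's locally weighted scheme.

```python
import numpy as np

def weighted_median(values, weights):
    """Weighted median: smallest value whose cumulative weight
    reaches half of the total weight."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cw = np.cumsum(w)
    return v[np.searchsorted(cw, 0.5 * cw[-1])]

def pareto_mle(sample):
    """Classical MLE for the Pareto shape alpha, taking the
    sample minimum as the scale x_m."""
    x = np.asarray(sample, dtype=float)
    xm = x.min()
    alpha = len(x) / np.sum(np.log(x / xm))
    return alpha, xm

rng = np.random.default_rng(0)
true_alpha, xm = 3.0, 1.0
# Draw a Pareto sample by inverse-CDF sampling: F(x) = 1 - (xm/x)^alpha.
sample = xm * (1.0 - rng.random(5000)) ** (-1.0 / true_alpha)

# Simulate an ensemble of 7 estimators (stand-ins for multiple NN models),
# each returning a slightly different estimate of alpha.
estimates = np.array([pareto_mle(sample)[0] + rng.normal(0.0, 0.1)
                      for _ in range(7)])

# Hypothetical local-accuracy weights: estimates far from the ensemble
# median contribute less to the final aggregation.
weights = 1.0 / (0.05 + np.abs(estimates - np.median(estimates)))
agg = weighted_median(estimates, weights)
```

The aggregation step is what lets a locally less accurate model "participate to a lesser extent": its deviation inflates the denominator of its weight, so it moves the weighted median very little.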
ISSN: 1110-0168
DOI: 10.1016/j.aej.2023.12.063