Robust Variable Selection Criteria for the Penalized Regression
Published in | arXiv.org
---|---
Main Authors |
Format | Paper
Language | English
Published | Ithaca: Cornell University Library, arXiv.org, 29.12.2019
Summary: We propose a robust variable selection procedure using a divergence-based M-estimator combined with a penalty function. It produces robust estimates of the regression parameters and simultaneously selects the important explanatory variables. An efficient algorithm based on a quadratic approximation of the estimating equation is constructed. The asymptotic distribution and the influence function of the regression coefficient estimators are derived. The widely used model selection procedures based on Mallows's \(C_p\) statistic and the Akaike information criterion (AIC) often perform very poorly in the presence of heavy-tailed errors or outliers. To address this, we introduce robust versions of these information criteria based on the proposed method. Simulation studies show that the robust variable selection technique outperforms classical likelihood-based techniques in the presence of outliers. The performance of the proposed method is also explored through real data analysis.
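The general recipe the summary describes — replace the robust estimating equation with a weighted least-squares (quadratic) surrogate, then apply the penalty — can be sketched in a minimal form. This is an illustrative stand-in, not the authors' divergence-based estimator: it uses Huber weights and an L1 penalty, and all names here (`robust_lasso`, `huber_weights`, `soft_threshold`, the tuning constants `lam` and `c`) are hypothetical choices for the sketch.

```python
import numpy as np

def huber_weights(r, c=1.345):
    # Huber psi(r)/r weights: 1 for small residuals, c/|r| for large ones,
    # so outlying observations are downweighted (c is the usual tuning constant).
    a = np.abs(r)
    w = np.ones_like(a)
    mask = a > c
    w[mask] = c / a[mask]
    return w

def soft_threshold(z, t):
    # Soft-thresholding operator arising from the L1 penalty.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def robust_lasso(X, y, lam=0.1, c=1.345, n_iter=100):
    """Illustrative robust penalized regression (NOT the paper's estimator):
    each outer step replaces the M-estimating equation with a weighted
    least-squares surrogate, then coordinate-wise soft-thresholding
    applies the L1 penalty, zeroing out unimportant coefficients."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        # Robust residual scale via the median absolute deviation (MAD).
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        w = huber_weights(r / s, c)
        for j in range(p):
            r_j = r + X[:, j] * beta[j]          # partial residual without x_j
            num = np.sum(w * X[:, j] * r_j)
            den = np.sum(w * X[:, j] ** 2) + 1e-12
            b_new = soft_threshold(num, n * lam) / den
            r -= X[:, j] * (b_new - beta[j])     # keep residuals in sync
            beta[j] = b_new
    return beta
```

On clean data this behaves like an ordinary lasso; when a fraction of the responses is contaminated, the Huber weights keep the coefficient estimates (and hence which variables are selected) close to the uncontaminated fit, which is the behavior the robust criteria in the paper are designed to exploit.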
ISSN: 2331-8422