The adaptive BerHu penalty in robust regression


Bibliographic Details
Published in: Journal of Nonparametric Statistics, Vol. 28, No. 3, pp. 487-514
Main Authors: Lambert-Lacroix, Sophie; Zwald, Laurent
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis, 2 July 2016
Publisher: Taylor & Francis Ltd; American Statistical Association

Summary: We intend to combine Huber's loss with an adaptive reversed version of it as a penalty function. The purpose is twofold: first, we propose an estimator that is robust to data subject to heavy-tailed errors or outliers; second, we aim to overcome the variable-selection problem in the presence of highly correlated predictors. In this framework, for instance, the adaptive least absolute shrinkage and selection operator (lasso) is not a very satisfactory variable-selection method, although it is a popular technique for simultaneous estimation and variable selection. We call this new penalty the adaptive BerHu penalty. As with the elastic net penalty, small coefficients contribute to this penalty through their absolute value, while larger coefficients cause it to grow quadratically (as in ridge regression). We show that the estimator combining Huber's loss with the adaptive BerHu penalty enjoys good theoretical properties in the fixed-design context. The approach is compared to existing regularisation methods, such as the adaptive elastic net, and is illustrated via simulation studies and real data.
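Concretely, the BerHu (reversed Huber) function swaps the two regimes of Huber's loss: it is linear near zero (lasso-like, inducing sparsity) and quadratic in the tails (ridge-like, as the summary describes). The following is a minimal sketch of both functions using their standard textbook definitions; the threshold value c = 1.345 and the per-coefficient adaptive weights of the paper's estimator are not reproduced here, and the choice of c is illustrative.

```python
import numpy as np

def huber_loss(t, c=1.345):
    """Huber's loss: quadratic for |t| <= c, linear beyond.
    Large residuals grow only linearly, giving robustness to
    heavy-tailed errors and outliers."""
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(t <= c, 0.5 * t ** 2, c * t - 0.5 * c ** 2)

def berhu_penalty(t, c=1.345):
    """Reversed Huber (BerHu) penalty: |t| for |t| <= c,
    (t^2 + c^2) / (2c) beyond. Small coefficients are penalised
    like the lasso; large ones quadratically, like ridge."""
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(t <= c, t, (t ** 2 + c ** 2) / (2.0 * c))
```

Both functions are continuous at the threshold: at |t| = c, Huber's loss equals c²/2 from either branch, and the BerHu penalty equals c. A penalised objective in the spirit of the paper would then sum `huber_loss` over the residuals and a weighted `berhu_penalty` over the coefficients.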
ISSN: 1048-5252; 1029-0311
DOI: 10.1080/10485252.2016.1190359