The predictive Lasso
Published in: Statistics and Computing, Vol. 22, no. 5, pp. 1069–1084
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.09.2012
ISSN: 0960-3174; 1573-1375
DOI: 10.1007/s11222-011-9279-3
Summary: We propose a shrinkage procedure for simultaneous variable selection and estimation in generalized linear models (GLMs) with an explicit predictive motivation. The procedure estimates the coefficients by minimizing the Kullback-Leibler divergence of a set of predictive distributions to the corresponding predictive distributions for the full model, subject to an ℓ1 constraint on the coefficient vector. This results in selection of a parsimonious model with similar predictive performance to the full model. Thanks to its similar form to the original Lasso problem for GLMs, our procedure can benefit from available ℓ1-regularization path algorithms. Simulation studies and real data examples confirm the efficiency of our method in terms of predictive performance on future observations.
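
In symbols, the criterion described in the summary can be read as below. This is a hedged formalization in our own notation, not necessarily the paper's exact statement: β̂_full is the full-model estimate, p(· | x_i, β) the GLM predictive density at covariate x_i, and t the ℓ1 budget.

```latex
% Hedged formalization of the predictive Lasso criterion (our notation):
% pick the coefficient vector whose predictive distributions stay close,
% in summed Kullback-Leibler divergence, to the full model's, subject to
% an l1 budget on the coefficients.
\hat{\beta}(t) = \arg\min_{\beta}
  \sum_{i=1}^{n} \mathrm{KL}\!\left(
    p(\tilde{y} \mid x_i, \hat{\beta}_{\mathrm{full}})
    \,\middle\|\,
    p(\tilde{y} \mid x_i, \beta)
  \right)
  \quad \text{subject to} \quad \lVert \beta \rVert_1 \le t .
```

For a Gaussian linear model with fixed variance, the summed KL divergence reduces, up to constants, to the squared distance between the linear predictors, so the constrained problem becomes an ordinary Lasso with the full model's fitted values as the response. The sketch below illustrates that special case with scikit-learn's path solver; the simulated data and all variable names are our own illustration, not the paper's reference implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, lasso_path

# Simulated data for illustration only (hypothetical, not from the paper).
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

# Step 1: fit the full model and record its predictive mean.
full = LinearRegression().fit(X, y)
y_hat_full = full.predict(X)

# Step 2: for a Gaussian model with fixed variance, minimizing the summed
# KL divergence to the full model's predictive distributions subject to an
# l1 constraint is an ordinary Lasso with y replaced by the full model's
# fitted values, so an existing l1-regularization path algorithm applies.
alphas, coefs, _ = lasso_path(X, y_hat_full)

# Each column of `coefs` is a sparse candidate along the l1 path; a tuning
# rule (e.g. cross-validated predictive performance) would pick one point.
print(coefs.shape)  # (p, n_alphas)
```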