The predictive Lasso

Bibliographic Details
Published in: Statistics and Computing, Vol. 22, No. 5, pp. 1069-1084
Main Authors: Tran, Minh-Ngoc; Nott, David J.; Leng, Chenlei
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.09.2012
ISSN: 0960-3174, 1573-1375
DOI: 10.1007/s11222-011-9279-3

More Information
Summary: We propose a shrinkage procedure for simultaneous variable selection and estimation in generalized linear models (GLMs) with an explicit predictive motivation. The procedure estimates the coefficients by minimizing the Kullback-Leibler divergence from a set of predictive distributions to the corresponding predictive distributions for the full model, subject to an l1 constraint on the coefficient vector. This results in the selection of a parsimonious model with predictive performance similar to that of the full model. Because the criterion has the same form as the original Lasso problem for GLMs, our procedure can exploit available l1-regularization path algorithms. Simulation studies and real data examples confirm the efficiency of our method in terms of predictive performance on future observations.
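To make the abstract's criterion concrete, the following is a minimal sketch for the Gaussian linear model, under the assumption that the KL divergence between Gaussian predictive distributions with a common variance reduces to the squared distance between predictive means, so the criterion becomes an ordinary Lasso fit with the response replaced by the full model's fitted values. This is an illustrative reduction, not the authors' reference implementation; the function name predictive_lasso is hypothetical, and scikit-learn's Lasso stands in for a generic l1-regularization path solver.

# Sketch of the predictive-Lasso idea for the Gaussian linear model.
# Assumption: for Gaussian predictives with common variance, minimizing
# KL divergence to the full model's predictive distributions amounts to
# a Lasso fit targeting the full model's fitted values.
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso

def predictive_lasso(X, y, alpha=0.1):
    """Hypothetical helper: Lasso fit to the full model's predictive
    means rather than to the raw response y."""
    # Step 1: fit the full (unpenalized) model to get predictive means.
    full = LinearRegression().fit(X, y)
    y_full = full.predict(X)
    # Step 2: minimize the squared distance to y_full plus an l1 penalty
    # (scikit-learn scales the squared-error term by 1/(2n)).
    return Lasso(alpha=alpha).fit(X, y_full)

# Usage on synthetic data where only the first two coefficients are nonzero:
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=200)
model = predictive_lasso(X, y, alpha=0.05)
print(np.round(model.coef_, 2))  # sparse vector close to [1, -2, 0, ...]

For non-Gaussian GLMs the same idea applies, but the KL term no longer collapses to a squared error and the fitted linear predictor of the full model takes the role of the target.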