Differential geometric least angle regression: a differential geometric approach to sparse generalized linear models


Bibliographic Details
Published in: Journal of the Royal Statistical Society. Series B, Statistical Methodology, Vol. 75, No. 3, pp. 471-498
Main Authors: Augugliaro, Luigi; Mineo, Angelo M.; Wit, Ernst C.
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Publishing Ltd / Oxford University Press, 01.06.2013

Summary: Sparsity is an essential feature of many contemporary data problems. Remote sensing, various forms of automated screening and other high throughput measurement devices collect a large amount of information, typically about a few independent statistical subjects or units. In certain cases it is reasonable to assume that the underlying process generating the data is itself sparse, in the sense that only a few of the measured variables are involved in the process. We propose an explicit method of monotonically decreasing sparsity for outcomes that can be modelled by an exponential family. In our approach we generalize the equiangular condition to a generalized linear model. Although the geometry involves the Fisher information in a way that is not obvious in the simple regression setting, the equiangular condition turns out to be equivalent to an intuitive condition imposed on the Rao score test statistics. In certain special cases the method can be tweaked to obtain L1-penalized generalized linear model solution paths, but the method itself defines sparsity more directly. Although the computation of the solution paths is not trivial, the method compares favourably with other path following algorithms.
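The condition on the Rao score test statistics mentioned in the summary implies, in particular, that the first variable to enter the solution path is the one maximizing the absolute score statistic at the intercept-only fit. The following is a minimal sketch of that single selection step for a logistic model on simulated data; the simulated design, the coefficient values and the null-model starting point are illustrative assumptions, and this is not the paper's actual predictor-corrector path algorithm:

```python
import numpy as np

def rao_score_stats(X, y, mu, var):
    """Rao score statistic per predictor: r_j = u_j / sqrt(I_jj),
    where u_j = x_j^T (y - mu) and I_jj = x_j^T diag(var) x_j."""
    u = X.T @ (y - mu)                       # score vector at the current fit
    info = np.einsum('ij,i,ij->j', X, var, X)  # diagonal of Fisher information
    return u / np.sqrt(info)

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.array([1.5, 0.0, 0.0, -1.0, 0.0])  # sparse truth (illustrative)
prob = 1.0 / (1.0 + np.exp(-(X @ beta_true)))
y = rng.binomial(1, prob)

# Score statistics at the null (intercept-only) logistic fit:
# fitted mean is the sample proportion, variance is mu * (1 - mu).
mu0 = np.full(n, y.mean())
var0 = mu0 * (1.0 - mu0)
r = rao_score_stats(X, y, mu0, var0)

j = int(np.argmax(np.abs(r)))  # first variable to enter the path
print(j, r)
```

Along the full path, the generalized equiangularity condition keeps the absolute score statistics of all active predictors equal while the common value decreases, which is what the method exploits instead of the correlation-based LARS criterion.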
ISSN: 1369-7412; 1467-9868
DOI: 10.1111/rssb.12000