Elastic Net Regularization Paths for All Generalized Linear Models

Bibliographic Details
Published in: Journal of Statistical Software, Vol. 106, No. 1
Main Authors: Tay, J. Kenneth; Narasimhan, Balasubramanian; Hastie, Trevor
Format: Journal Article
Language: English
Published: United States: Foundation for Open Access Statistics, 2023

Summary: The lasso and elastic net are popular regularized regression models for supervised learning. Friedman, Hastie, and Tibshirani (2010) introduced a computationally efficient algorithm for computing the elastic net regularization path for ordinary least squares regression, logistic regression, and multinomial logistic regression, while Simon, Friedman, Hastie, and Tibshirani (2011) extended this work to Cox models for right-censored data. We further extend the reach of elastic net-regularized regression to all generalized linear model families, Cox models with (start, stop] data and strata, and a simplified version of the relaxed lasso. We also discuss convenient utility functions for measuring the performance of these fitted models.
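To make the summary concrete, the sketch below shows how these extensions are typically exercised through the glmnet R package described in the article. The simulated data, parameter values, and chosen families are illustrative assumptions, not examples taken from the paper.

    # Minimal sketch, assuming the glmnet (>= 4.1) and survival R packages;
    # the data and settings below are made up for illustration only.
    library(glmnet)
    library(survival)

    set.seed(1)
    n <- 100; p <- 10
    x <- matrix(rnorm(n * p), n, p)

    # Any GLM family: pass a family object rather than a character string.
    y_pois <- rpois(n, exp(x[, 1]))
    fit_pois <- glmnet(x, y_pois, family = poisson())

    # Cox model with (start, stop] data and strata.
    tstart <- runif(n)
    tstop  <- tstart + rexp(n)
    status <- rbinom(n, 1, 0.5)
    y_cox  <- stratifySurv(Surv(tstart, tstop, status), strata = rep(1:2, n / 2))
    fit_cox <- glmnet(x, y_cox, family = "cox")

    # Simplified version of the relaxed lasso.
    y_gauss <- x[, 1] + rnorm(n)
    fit_relax <- glmnet(x, y_gauss, relax = TRUE)

    # Utility function for measuring performance of a fitted model on test data.
    assess.glmnet(fit_pois, newx = x, newy = y_pois)

In this sketch the family object (poisson()) replaces the fixed set of built-in family strings, the (start, stop] Surv object plus stratifySurv supplies the extended Cox functionality, relax = TRUE requests the simplified relaxed lasso, and assess.glmnet is one of the performance-measurement utilities the summary mentions.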
ISSN: 1548-7660
DOI: 10.18637/jss.v106.i01