A Double-Penalized Estimator to Combat Separation and Multicollinearity in Logistic Regression


Bibliographic Details
Published in: Mathematics (Basel), Vol. 10, No. 20, p. 3824
Main Authors: Guan, Ying; Fu, Guang-Hui
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.10.2022

Summary: When developing prediction models for small or sparse binary data with many highly correlated covariates, logistic regression often encounters separation or multicollinearity problems, resulting in serious bias and even the nonexistence of standard maximum likelihood estimates. The combination of separation and multicollinearity makes logistic regression still more difficult, yet few studies have addressed the two problems simultaneously. In this paper, we propose a double-penalized method called lFRE to combat separation and multicollinearity in logistic regression. lFRE combines the logF-type penalty with the ridge penalty. The results indicate that, compared with other penalty methods, lFRE can not only effectively remove bias from predicted probabilities but also achieve the minimum mean squared prediction error. In addition, a real dataset is employed to test the performance of the lFRE algorithm against several existing methods. The results show that lFRE is strongly competitive with these methods and can serve as an alternative algorithm for solving separation and multicollinearity problems in logistic regression.
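The double-penalized idea described in the summary can be illustrated with a minimal sketch: maximize the logistic log-likelihood plus a logF(m, m)-type penalty on the coefficients and minus a ridge term. The function name `lfre_fit`, the penalty applied uniformly to all coefficients (including the intercept), and the default tuning values `m = 1.0` and `lam = 0.1` are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np
from scipy.optimize import minimize


def lfre_fit(X, y, m=1.0, lam=0.1):
    """Sketch of a double-penalized logistic fit in the spirit of lFRE.

    Maximizes:  log-likelihood
              + sum_j [ (m/2) * beta_j - m * log(1 + exp(beta_j)) ]  (logF(m,m)-type)
              - lam * sum_j beta_j**2                                 (ridge)

    Assumptions: all coefficients (including any intercept column in X)
    are penalized; the paper's actual tuning and implementation may differ.
    """
    n, p = X.shape

    def neg_pen_loglik(beta):
        eta = X @ beta
        # Logistic log-likelihood: sum_i [ y_i * eta_i - log(1 + exp(eta_i)) ]
        ll = np.sum(y * eta - np.logaddexp(0.0, eta))
        # logF(m,m)-type penalty; finite even under complete separation
        logf = np.sum(0.5 * m * beta - m * np.logaddexp(0.0, beta))
        # Ridge penalty to stabilize estimates under multicollinearity
        ridge = lam * np.sum(beta ** 2)
        return -(ll + logf) + ridge

    res = minimize(neg_pen_loglik, np.zeros(p), method="BFGS")
    return res.x


# Usage on a tiny quasi-separated design: plain maximum likelihood would
# diverge here, but the penalized estimate stays finite.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
beta_hat = lfre_fit(X, y)
```

Both penalty terms are smooth, so a generic quasi-Newton solver suffices for the sketch; the ridge term keeps the objective strongly convex in directions where the design is nearly collinear.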
ISSN: 2227-7390
DOI: 10.3390/math10203824