A Safe Feature Screening Rule for Rank Lasso

Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 29, pp. 1062-1066
Main Authors: Shang, Pan; Kong, Lingchen; Liu, Dashuai
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2022

Summary: To deal with outliers or heavy-tailed random errors, which are common in high-dimensional data sets, robust regression methods are preferable, and Rank Lasso is a notable model among them. However, the large feature size of such data sets increases the computational cost of solving Rank Lasso. In this paper, we build a safe feature screening rule for Rank Lasso, which can effectively and safely identify inactive features and thereby reduce the computation time of the model. The advantage of our screening rule is that it can be expressed as a closed-form function of the given data and is easy to implement. The proposed rule is evaluated on simulated and real data sets; the results show that it safely discards inactive features at a small computational cost and reduces the time needed to solve Rank Lasso.
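The abstract's exact screening bound for Rank Lasso is not reproduced in this record. As a hedged illustration of what a "closed-form safe screening rule" means in practice, the sketch below implements the classic SAFE test for the *standard* Lasso (El Ghaoui et al.): a feature is discarded only when a closed-form inequality certifies its coefficient is exactly zero at the solution. The paper's Rank Lasso rule is a different bound built on the pairwise rank loss; this is not the authors' method.

```python
import numpy as np

def safe_screen_lasso(X, y, lam):
    """Classic SAFE screening test for the standard Lasso, shown only
    to illustrate the general shape of a closed-form safe screening
    rule; the Rank Lasso rule derived in the paper is a different bound.
    Returns a boolean mask of features that CANNOT be safely discarded."""
    corr = np.abs(X.T @ y)                # |x_j^T y| for every feature j
    lam_max = np.max(corr)                # smallest lambda giving beta = 0
    col_norms = np.linalg.norm(X, axis=0)
    # Feature j is provably inactive (coefficient exactly 0 at the
    # Lasso solution) whenever corr_j falls strictly below this bound,
    # which is a closed-form function of the data -- no solver needed.
    bound = lam - col_norms * np.linalg.norm(y) * (lam_max - lam) / lam_max
    return corr >= bound

# Usage: screen before solving, then fit the model on X[:, keep] only.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
y = rng.standard_normal(50)
lam_max = np.max(np.abs(X.T @ y))
keep = safe_screen_lasso(X, y, 0.95 * lam_max)
```

Safe rules of this kind only ever remove features whose coefficients are zero at the optimum, so the reduced problem has the same solution as the full one; the speedup comes from solving the model on the surviving columns.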
ISSN: 1070-9908; 1558-2361
DOI: 10.1109/LSP.2022.3167918