Reduced rank regression via adaptive nuclear norm penalization
Main Authors | , , |
---|---|
Format | Journal Article |
Language | English |
Published | 01.01.2012 |
Summary: Adaptive nuclear-norm penalization is proposed for low-rank matrix
approximation, by which we develop a new reduced-rank estimation method for
general high-dimensional multivariate regression problems. The adaptive nuclear
norm of a matrix is defined as the weighted sum of the singular values of the
matrix; for example, the pre-specified weights may be some negative power of
the singular values of the data matrix (or of its projection in the regression
setting). The adaptive nuclear norm is generally non-convex under the natural
restriction that the weight decreases as the singular value increases. However,
we show that the proposed non-convex penalized regression method has a global
optimal solution obtained from an adaptively soft-thresholded singular value
decomposition. This new reduced-rank estimator is computationally efficient,
has a continuous solution path, and possesses better bias-variance properties
than its classical counterpart. Rank consistency and prediction/estimation
performance bounds for the proposed estimator are established under a
high-dimensional asymptotic regime. Simulation studies and an application in
genetics demonstrate that the proposed estimator outperforms several existing
methods. Adaptive nuclear-norm penalization can also serve as a building block
for studying a broad class of singular value penalties.
DOI: 10.48550/arxiv.1201.0381
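The summary describes the estimator concretely enough to sketch: weight each
singular value of the projected response by a negative power of that singular
value, then soft-threshold. Below is a minimal NumPy sketch of that recipe; the
objective scaling, the choice `gamma=2`, the tuning value `lam`, and the
least-squares mapping from the fitted matrix back to a coefficient matrix are
illustrative assumptions, not the paper's exact specification.

```python
import numpy as np


def adaptive_nn_rrr(Y, X, lam, gamma=2.0):
    """Illustrative sketch of reduced-rank regression with an adaptive
    nuclear-norm penalty: weights are a negative power of the singular
    values of the projection of Y onto the column space of X, and the fit
    is an adaptively soft-thresholded SVD of that projection."""
    # Least-squares fit of Y on X; X @ C_ols is the projection P_X Y.
    C_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    proj = X @ C_ols

    # SVD of the projected response.
    U, d, Vt = np.linalg.svd(proj, full_matrices=False)

    # Adaptive weights: smaller singular values receive heavier penalties.
    # Exact zeros are mapped away before the negative power to avoid a
    # divide-by-zero; those components are eliminated by thresholding anyway.
    w = np.where(d > 0, d, np.inf) ** (-gamma)

    # Adaptive soft-thresholding of the singular values.
    d_shrunk = np.maximum(d - lam * w, 0.0)

    # Low-rank fitted matrix, mapped back to a coefficient estimate via
    # least squares (a convenience choice for this sketch).
    fit = (U * d_shrunk) @ Vt
    C_hat, *_ = np.linalg.lstsq(X, fit, rcond=None)
    return C_hat


# Small simulated example: rank-2 coefficient matrix, 10 predictors, 8 responses.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
B = rng.standard_normal((10, 2)) @ rng.standard_normal((2, 8))
Y = X @ B + 0.1 * rng.standard_normal((100, 8))

C_hat = adaptive_nn_rrr(Y, X, lam=5.0, gamma=2.0)
print("estimated rank:", np.linalg.matrix_rank(C_hat))
```

With a sufficiently large `lam`, the weak (noise) directions of the projected
fit are thresholded to zero while the strong signal directions are barely
shrunk, which is the bias-variance advantage over a hard rank truncation that
the summary claims.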