Iterative reconstruction of rank-one matrices in noise

Bibliographic Details
Published in: Information and Inference, Vol. 7, No. 3, pp. 531–562
Main Authors: Fletcher, Alyson K.; Rangan, Sundeep
Format: Journal Article
Language: English
Published: 19.09.2018
ISSN: 2049-8764, 2049-8772
DOI: 10.1093/imaiai/iax014

More Information
Summary: We consider the problem of estimating a rank-one matrix in Gaussian noise under a probabilistic model for the left and right factors of the matrix. The probabilistic model can impose constraints on the factors including sparsity and positivity that arise commonly in learning problems. We propose a family of algorithms that reduce the problem to a sequence of scalar estimation computations. These algorithms are similar to approximate message-passing techniques based on Gaussian approximations of loopy belief propagation that have been used recently in compressed sensing. Leveraging analysis methods by Bayati and Montanari, we show that the asymptotic behavior of the algorithm is described by a simple scalar equivalent model, where the distribution of the estimates at each iteration is identical to certain scalar estimates of the variables in Gaussian noise. Moreover, the effective Gaussian noise level is described by a set of state evolution equations. The proposed approach to deriving algorithms thus provides a computationally simple and general method for rank-one estimation problems with a precise analysis in certain high-dimensional settings.
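The summary describes an iterative structure in which each pass alternates matrix-vector products with componentwise scalar denoising, plus an approximate-message-passing-style memory correction whose coefficient is the empirical average derivative of the denoiser. The sketch below is a minimal generic illustration of that structure, not the authors' algorithm or code: the model scaling, the soft-thresholding denoiser (standing in for a posterior-mean estimator under a sparsity prior), and all variable names are assumptions made for illustration.

```python
# Illustrative AMP-style sketch for rank-one estimation, Y ≈ u v^T + noise.
# Assumes Y is scaled so that matrix-vector products stay O(1) (e.g. noise
# entries of variance 1/n); this scaling convention is an assumption here.
import numpy as np


def soft_threshold(x, tau):
    # Example componentwise denoiser (sparsity-promoting). The framework in
    # the summary allows any scalar estimator, e.g. the posterior mean under
    # the assumed prior on the factor entries.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)


def rank_one_amp(Y, n_iter=30, tau=0.5, seed=0):
    m, n = Y.shape
    rng = np.random.default_rng(seed)
    v_hat = rng.standard_normal(n)   # crude random initialization of the right factor
    u_hat = np.zeros(m)
    deriv_v = 0.0                    # memory-correction coefficients (Onsager terms)
    deriv_u = 0.0
    for _ in range(n_iter):
        # Left-factor step: matrix-vector product plus memory correction,
        # followed by scalar denoising of each component.
        r_u = Y @ v_hat - deriv_v * u_hat
        u_hat = soft_threshold(r_u, tau)
        # Average derivative of soft_threshold at its input (1 where |r| > tau).
        deriv_u = (m / n) * np.mean(np.abs(r_u) > tau)

        # Right-factor step, symmetric to the one above.
        r_v = Y.T @ u_hat - deriv_u * v_hat
        v_hat = soft_threshold(r_v, tau)
        deriv_v = np.mean(np.abs(r_v) > tau)
    return u_hat, v_hat
```

In the setting the summary describes, the distribution of the per-iteration inputs to the denoiser (here `r_u`, `r_v`) is asymptotically that of the true factor entries in Gaussian noise, with a noise level tracked by state evolution equations; the soft-thresholding step above is only one example of the scalar estimation computation that each iteration reduces to.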