Exact Matrix Completion via Convex Optimization
Published in: Foundations of Computational Mathematics, Vol. 9, No. 6, pp. 717–772
Main Authors: Emmanuel J. Candès, Benjamin Recht
Format: Journal Article
Language: English
Published: New York: Springer-Verlag, 01.12.2009 (Springer Nature B.V.)
Summary: We consider a problem of considerable practical interest: the recovery of a data matrix from a sampling of its entries. Suppose that we observe m entries selected uniformly at random from a matrix M. Can we complete the matrix and recover the entries that we have not seen? We show that one can perfectly recover most low-rank matrices from what appears to be an incomplete set of entries. We prove that if the number m of sampled entries obeys m ≥ C n^1.2 r log(n) for some positive numerical constant C, then with very high probability, most n × n matrices of rank r can be perfectly recovered by solving a simple convex optimization program. This program finds the matrix with minimum nuclear norm that fits the data. The condition above assumes that the rank is not too large. However, if one replaces the 1.2 exponent with 1.25, then the result holds for all values of the rank. Similar results hold for arbitrary rectangular matrices as well. Our results are connected with the recent literature on compressed sensing, and show that objects other than signals and images can be perfectly reconstructed from very limited information.
ISSN: 1615-3375; 1615-3383
DOI: 10.1007/s10208-009-9045-5