
NOISY MATRIX COMPLETION: UNDERSTANDING STATISTICAL GUARANTEES FOR CONVEX RELAXATION VIA NONCONVEX OPTIMIZATION

Overview
Journal: SIAM J Optim
Date: 2021 Jul 26
PMID: 34305368
Citations: 9
Abstract

This paper studies noisy low-rank matrix completion: given partial and noisy entries of a large low-rank matrix, the goal is to estimate the underlying matrix faithfully and efficiently. Arguably one of the most popular paradigms to tackle this problem is convex relaxation, which achieves remarkable efficacy in practice. However, the theoretical support of this approach is still far from optimal in the noisy setting, falling short of explaining its empirical success. We make progress towards demystifying the practical efficacy of convex relaxation vis-à-vis random noise. When the rank and the condition number of the unknown matrix are bounded by a constant, we demonstrate that the convex programming approach achieves near-optimal estimation errors - in terms of the Euclidean loss, the entrywise loss, and the spectral norm loss - for a wide range of noise levels. All of this is enabled by bridging convex relaxation with the nonconvex Burer-Monteiro approach, a seemingly distinct algorithmic paradigm that is provably robust against noise. More specifically, we show that an approximate critical point of the nonconvex formulation serves as an extremely tight approximation of the convex solution, thus allowing us to transfer the desired statistical guarantees of the nonconvex approach to its convex counterpart.
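The Burer-Monteiro approach referenced in the abstract replaces the convex program with an unconstrained problem over low-rank factors and optimizes it directly. The following is a minimal illustrative sketch of this idea via gradient descent on the factored least-squares objective; it is not the authors' algorithm (their analysis involves specific initialization and approximate critical points), and the function name, step size, and iteration count are hypothetical choices for illustration.

```python
import numpy as np

def complete_matrix(Y, mask, rank, lr=0.01, n_iters=5000, seed=0):
    """Sketch of Burer-Monteiro-style matrix completion.

    Minimizes 0.5 * || mask * (L @ R.T - Y) ||_F^2 over the
    low-rank factors L (n1 x r) and R (n2 x r) by gradient descent.
    Y holds the observed (possibly noisy) entries; mask is 0/1.
    """
    rng = np.random.default_rng(seed)
    n1, n2 = Y.shape
    # Small random initialization of the two factors.
    L = rng.normal(scale=1.0 / np.sqrt(rank), size=(n1, rank))
    R = rng.normal(scale=1.0 / np.sqrt(rank), size=(n2, rank))
    for _ in range(n_iters):
        # Residual restricted to the observed entries only.
        resid = mask * (L @ R.T - Y)
        gL = resid @ R        # gradient w.r.t. L
        gR = resid.T @ L      # gradient w.r.t. R
        L -= lr * gL
        R -= lr * gR
    return L @ R.T
```

On a well-conditioned low-rank matrix with a constant fraction of entries observed, this kind of factored gradient descent typically drives the estimation error down to the noise level, which is the robustness property the paper transfers to the convex solution.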

Citing Articles

GNMR: A Provable One-Line Algorithm for Low Rank Matrix Recovery.

Zilber P, Nadler B. SIAM J Math Data Sci. 2025; 4(2):909-934.

PMID: 39896132 PMC: 11784930. DOI: 10.1137/21m1433812.


Approximate message passing from random initialization with applications to synchronization.

Li G, Fan W, Wei Y. Proc Natl Acad Sci U S A. 2023; 120(31):e2302930120.

PMID: 37490538 PMC: 10401009. DOI: 10.1073/pnas.2302930120.


Convex and Nonconvex Optimization Are Both Minimax-Optimal for Noisy Blind Deconvolution under Random Designs.

Chen Y, Fan J, Wang B, Yan Y. J Am Stat Assoc. 2023; 118(542):858-868.

PMID: 37313368 PMC: 10259835. DOI: 10.1080/01621459.2021.1956501.


Sample-Efficient Reinforcement Learning for Linearly-Parameterized MDPs with a Generative Model.

Wang B, Yan Y, Fan J. Adv Neural Inf Process Syst. 2022; 34:16671-16685.

PMID: 36168331 PMC: 9512142.


Bridging Convex and Nonconvex Optimization in Robust PCA: Noise, Outliers, and Missing Data.

Chen Y, Fan J, Ma C, Yan Y. Ann Stat. 2022; 49(5):2948-2971.

PMID: 36148268 PMC: 9491514. DOI: 10.1214/21-aos2066.


References
1.
Zhao T, Wang Z, Liu H . A Nonconvex Optimization Framework for Low Rank Matrix Estimation. Adv Neural Inf Process Syst. 2017; 28:559-567. PMC: 5354472. View

2.
Zhang T, Pauly J, Levesque I . Accelerating parameter mapping with a locally low rank constraint. Magn Reson Med. 2014; 73(2):655-61. PMC: 4122652. DOI: 10.1002/mrm.25161. View

3.
Chen Y, Fan J, Ma C, Yan Y . Inference and uncertainty quantification for noisy matrix completion. Proc Natl Acad Sci U S A. 2019; 116(46):22931-22937. PMC: 6859358. DOI: 10.1073/pnas.1910053116. View

4.
Fan J, Han X, Gu W . Estimating False Discovery Proportion Under Arbitrary Covariance Dependence. J Am Stat Assoc. 2014; 107(499):1019-1035. PMC: 3983872. DOI: 10.1080/01621459.2012.720478. View

5.
Fan J, Xue L, Yao J . Sufficient Forecasting Using Factor Models. J Econom. 2018; 201(2):292-306. PMC: 5931744. DOI: 10.1016/j.jeconom.2017.08.009. View