Bridging convex and nonconvex optimization in noisy matrix completion: Stability and uncertainty quantification
This talk is concerned with noisy matrix completion: given partial and corrupted entries of a large low-rank matrix, how does one estimate and perform inference on the underlying matrix? Arguably one of the most popular paradigms for tackling this problem is convex relaxation, which achieves remarkable efficacy in practice. However, the statistical stability guarantees for this approach are still far from optimal in the noisy setting, falling short of explaining its empirical success. Moreover, it is generally very challenging to pin down the distribution of the convex solution, which presents a major roadblock to assessing the uncertainty, or “confidence,” of the obtained estimates, a crucial task at the core of statistical inference.
Our recent work makes progress toward understanding stability and uncertainty quantification for matrix completion. (1) We demonstrate that the convex programming approach achieves near-optimal estimation errors vis-à-vis random noise; (2) we develop a de-biased estimator that admits entrywise distributional characterizations, thus enabling asymptotically optimal inference. Both results are obtained by bridging convex relaxation with the nonconvex Burer-Monteiro approach, a seemingly distinct algorithmic paradigm that is provably robust against noise.
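As a rough illustration of the nonconvex side of this bridge, the sketch below runs Burer-Monteiro-style matrix completion in NumPy: factor the matrix as XY^T, initialize spectrally from the zero-filled observations, and run gradient descent on the squared error over observed entries. The sampling rate, rank, noise level, step size, and iteration count here are all illustrative assumptions, not the settings analyzed in the work described above.

```python
import numpy as np

# Illustrative sketch of Burer-Monteiro matrix completion; all
# problem sizes and algorithmic parameters below are assumptions.
rng = np.random.default_rng(0)
n, r = 50, 3

# Ground-truth rank-r matrix; observe ~50% of entries with small noise.
M_star = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = rng.random((n, n)) < 0.5
M_obs = M_star + 0.01 * rng.standard_normal((n, n))

# Spectral initialization: SVD of the zero-filled, rescaled observations.
p = mask.mean()
U, s, Vt = np.linalg.svd(np.where(mask, M_obs, 0.0) / p, full_matrices=False)
X = U[:, :r] * np.sqrt(s[:r])
Y = Vt[:r, :].T * np.sqrt(s[:r])

# Gradient descent on 0.5 * || P_Omega(X Y^T - M_obs) ||_F^2,
# with step size scaled by the top singular value.
eta = 0.2 / s[0]
for _ in range(500):
    R = np.where(mask, X @ Y.T - M_obs, 0.0)  # residual on observed entries
    X, Y = X - eta * (R @ Y), Y - eta * (R.T @ X)

rel_err = np.linalg.norm(X @ Y.T - M_star) / np.linalg.norm(M_star)
print(f"relative recovery error: {rel_err:.3f}")
```

Despite the nonconvex objective, spectral initialization followed by plain gradient descent recovers the low-rank matrix to high accuracy here, which is the kind of noise-robust behavior the nonconvex analysis makes precise.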
This is based on joint work with Cong Ma, Yuling Yan, Yuejie Chi, and Jianqing Fan.
Yuxin Chen is currently an assistant professor in the Department of Electrical Engineering at Princeton University. Prior to joining Princeton, he was a postdoctoral scholar in the Department of Statistics at Stanford University, and he completed his Ph.D. in Electrical Engineering at Stanford University. His research interests include high-dimensional statistics, convex and nonconvex optimization, statistical learning, and information theory. He received the 2019 AFOSR Young Investigator Award.