No spurious local minima in nonconvex low rank problems: A unified geometric analysis

Conference Paper

In this paper we develop a new framework that captures the common landscape underlying the popular non-convex low-rank matrix problems, including matrix sensing, matrix completion, and robust PCA. In particular, we show for all of the above problems (including asymmetric cases): 1) all local minima are also globally optimal; 2) no high-order saddle points exist. These results explain why simple algorithms such as stochastic gradient descent converge globally and efficiently optimize these non-convex objective functions in practice. Our framework connects and simplifies the existing analyses of optimization landscapes for matrix sensing and symmetric matrix completion. The framework naturally leads to new results for asymmetric matrix completion and robust PCA.
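The abstract's claim, that every local minimum of these non-convex low-rank objectives is globally optimal, can be illustrated numerically. The sketch below (not code from the paper; the matrix sizes, step size, and iteration count are illustrative choices) runs plain gradient descent from a random initialization on the symmetric factorization objective f(U) = ||UU^T − M||_F^2, a simple instance of the landscapes the paper analyzes, and reaches the global optimum:

```python
import numpy as np

# Illustrative sketch, not the paper's code: gradient descent on
# f(U) = ||U U^T - M||_F^2, where M is a rank-r PSD ground truth.
# The objective is non-convex in U, yet random initialization plus
# plain gradient descent finds a global minimum (loss ~ 0).
rng = np.random.default_rng(0)

d, r = 10, 2
A = rng.standard_normal((d, r))
M = A @ A.T                              # ground-truth rank-r PSD matrix

U = 0.1 * rng.standard_normal((d, r))    # small random initialization
step = 0.1 / np.linalg.norm(M, 2)        # conservative step size

for _ in range(5000):
    residual = U @ U.T - M
    grad = 4 * residual @ U              # gradient of ||UU^T - M||_F^2
    U -= step * grad

final_loss = np.linalg.norm(U @ U.T - M) ** 2
print(final_loss)
```

Note that U = 0 is a stationary point (a strict saddle), which is why the initialization is random rather than zero; this matches the abstract's point that the absence of high-order saddles lets first-order methods escape.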

Cited Authors

  • Ge, R; Jin, C; Zheng, Y

Published Date

  • January 1, 2017

Published In

  • 34th International Conference on Machine Learning, ICML 2017

Volume / Issue

  • 3 /

Start / End Page

  • 1990 - 2028

International Standard Book Number 13 (ISBN-13)

  • 9781510855144

Citation Source

  • Scopus