Matrix completion has no spurious local minimum

Conference Paper

Matrix completion is a basic machine learning problem with wide applications, especially in collaborative filtering and recommender systems. Simple non-convex optimization algorithms are popular and effective in practice. Despite recent progress in proving that various non-convex algorithms converge from a good initial point, it remains unclear why random or arbitrary initialization suffices in practice. We prove that the commonly used non-convex objective function for positive semidefinite matrix completion has no spurious local minima: all local minima must also be global. Therefore, many popular optimization algorithms, such as (stochastic) gradient descent, can provably solve positive semidefinite matrix completion with arbitrary initialization in polynomial time. The result generalizes to the setting where the observed entries contain noise. We believe that our main proof strategy can be useful for understanding geometric properties of other statistical problems involving partial or noisy observations.
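
The setup the abstract describes can be illustrated concretely: run plain gradient descent from a random initialization on the factored objective f(U) = 1/2 * sum over observed (i, j) of ((U U^T)_{ij} - M_{ij})^2, where M is a low-rank positive semidefinite ground truth. The NumPy sketch below is illustrative only, not the paper's code; the problem sizes, observation probability, step size, and iteration count are assumptions chosen for the example.

```python
import numpy as np

# Illustrative sketch: gradient descent on the non-convex factored objective
#     f(U) = 1/2 * sum_{(i,j) observed} ((U U^T)_{ij} - M_{ij})^2
# for positive semidefinite matrix completion, started from a random point.
# All constants below (n, r, p, step size, iterations) are assumed values.

rng = np.random.default_rng(0)
n, r, p = 50, 3, 0.3                       # dimension, rank, sampling rate

Z = rng.standard_normal((n, r))
M = Z @ Z.T                                # PSD ground truth of rank r
upper = np.triu(rng.random((n, n)) < p)
mask = upper | upper.T                     # symmetric set of observed entries

def objective(U):
    R = (U @ U.T - M) * mask               # residual on observed entries only
    return 0.5 * np.sum(R ** 2)

def gradient(U):
    R = (U @ U.T - M) * mask
    return 2.0 * R @ U                     # gradient of f (R is symmetric)

U = rng.standard_normal((n, r))            # arbitrary (random) initialization
eta = 0.25 / np.linalg.norm(M, 2)          # small step relative to ||M||_2
for _ in range(3000):
    U -= eta * gradient(U)

print("objective value:", objective(U))
print("relative error ||UU^T - M||_F / ||M||_F:",
      np.linalg.norm(U @ U.T - M) / np.linalg.norm(M))
```

On such a random, well-conditioned instance the iterates typically drive the objective close to zero without any careful initialization, which is the behavior the paper's no-spurious-local-minima result explains.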

Duke Authors

  • Ge, R

Cited Authors

  • Ge, R; Lee, JD; Ma, T

Published Date

  • January 1, 2016

Published In

  • Advances in Neural Information Processing Systems

Start / End Page

  • 2981 - 2989

International Standard Serial Number (ISSN)

  • 1049-5258

Citation Source

  • Scopus