
On the optimization landscape of tensor decompositions

Publication, Conference
Ge, R.; Ma, T.
Published in: Advances in Neural Information Processing Systems
January 1, 2017

Non-convex optimization with local search heuristics has been widely used in machine learning, achieving many state-of-the-art results. It has become increasingly important to understand why these heuristics work for NP-hard problems on typical data. The landscape of many objective functions in learning has been conjectured to have the geometric property that "all local optima are (approximately) global optima", so that they can be solved efficiently by local search algorithms. However, establishing such a property can be very difficult. In this paper, we analyze the optimization landscape of the random over-complete tensor decomposition problem, which has many applications in unsupervised learning, especially in learning latent variable models. In practice, it can be efficiently solved by gradient ascent on a non-convex objective. We show that for any small constant ϵ > 0, among the set of points with function values a (1 + ϵ)-factor larger than the expectation of the function, all local maxima are approximate global maxima. Previously, the best-known result only characterized the geometry in small neighborhoods around the true components. Our result implies that even with an initialization that is barely better than a random guess, the gradient ascent algorithm is guaranteed to solve this problem. Our main techniques are the Kac-Rice formula and random matrix theory. To the best of our knowledge, this is the first time the Kac-Rice formula has been successfully applied to counting the number of local optima of a highly structured random polynomial with dependent coefficients.
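For concreteness, below is a minimal sketch of the kind of procedure the abstract refers to: projected gradient ascent on the unit sphere for the standard degree-4 objective f(x) = Σ_i ⟨a_i, x⟩^4 associated with the tensor T = Σ_i a_i⊗a_i⊗a_i⊗a_i. This is an illustrative assumption, not the paper's exact setup; the precise objective, scaling, and step-size schedule analyzed in the paper may differ, and the random unit-norm components and hyperparameters here are placeholders.

```python
import numpy as np

def objective(x, A):
    """f(x) = sum_i <a_i, x>^4 -- a common objective for decomposing the
    4th-order tensor sum_i a_i^(x4); exact form in the paper may differ."""
    return np.sum((A @ x) ** 4)

def projected_gradient_ascent(A, steps=5000, lr=0.05, seed=0):
    """Gradient ascent on the unit sphere from a random initialization.

    A: (n, d) array whose rows are the (unknown) components a_i.
    Per the landscape result, local maxima with large function value
    should be close to one of the components (up to sign) when the a_i
    are random and the problem is over-complete (n > d).
    """
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)                # random start on the sphere
    for _ in range(steps):
        grad = 4 * A.T @ ((A @ x) ** 3)   # gradient of sum_i <a_i, x>^4
        x = x + lr * grad
        x /= np.linalg.norm(x)            # project back onto the sphere
    return x

if __name__ == "__main__":
    d, n = 50, 100                        # over-complete: more components than dimensions
    rng = np.random.default_rng(1)
    A = rng.standard_normal((n, d))
    A /= np.linalg.norm(A, axis=1, keepdims=True)   # unit-norm random components
    x = projected_gradient_ascent(A)
    # correlation with the best-matching component (up to sign)
    print("max |<a_i, x>| =", np.max(np.abs(A @ x)))
```

In this hypothetical setup, a large value of max |⟨a_i, x⟩| indicates that ascent from a random start recovered one component, which is the behavior the paper's landscape analysis explains.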


Published In

Advances in Neural Information Processing Systems

ISSN

1049-5258

Publication Date

January 1, 2017

Volume

2017-December

Start / End Page

3654 / 3664

Related Subject Headings

  • 4611 Machine learning
  • 1702 Cognitive Sciences
  • 1701 Psychology
 

Citation

APA: Ge, R., & Ma, T. (2017). On the optimization landscape of tensor decompositions. In Advances in Neural Information Processing Systems (Vol. 2017-December, pp. 3654–3664).
Chicago: Ge, R., and T. Ma. “On the optimization landscape of tensor decompositions.” In Advances in Neural Information Processing Systems, 2017-December:3654–64, 2017.
ICMJE: Ge R, Ma T. On the optimization landscape of tensor decompositions. In: Advances in Neural Information Processing Systems. 2017. p. 3654–64.
MLA: Ge, R., and T. Ma. “On the optimization landscape of tensor decompositions.” Advances in Neural Information Processing Systems, vol. 2017-December, 2017, pp. 3654–64.
NLM: Ge R, Ma T. On the optimization landscape of tensor decompositions. Advances in Neural Information Processing Systems. 2017. p. 3654–3664.
