
Accelerated algorithms for convex and non-convex optimization on manifolds
We propose a general scheme for solving convex and non-convex optimization problems on manifolds. The central idea is that, by adding a multiple of the squared retraction distance to the objective function in question, we “convexify” the objective and solve a series of convex sub-problems within the optimization procedure. The proposed algorithm adapts to the level of complexity of the objective function without requiring knowledge of whether the objective is convex or non-convex. We show that when the objective function is convex, the algorithm provably converges to the optimum at an accelerated rate. When the objective function is non-convex, the algorithm converges to a stationary point. Our method unifies insights from Nesterov’s original idea for accelerating gradient descent with recent developments in optimization algorithms in Euclidean space. We demonstrate the utility of our algorithms on several manifold optimization tasks, such as estimating intrinsic and extrinsic Fréchet means on spheres and low-rank matrix factorization on Grassmann manifolds applied to the Netflix rating data set.
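To make the scheme concrete, the following is a minimal Python sketch of the idea the abstract describes: repeatedly minimizing the objective plus a multiple of a squared distance to the current iterate, here on the unit sphere. This is an illustrative sketch, not the authors' implementation: it omits the acceleration and adaptivity of the paper's algorithm, uses the ambient squared distance as a stand-in for the squared retraction distance, and all names (prox_step, minimize_on_sphere, lam, step) are assumptions introduced for this example.

import numpy as np

def retract(x, v):
    """Projection retraction on the sphere: step along v, then renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def proj_tangent(x, g):
    """Project an ambient gradient g onto the tangent space at unit vector x."""
    return g - np.dot(g, x) * x

def prox_step(f_grad, x, lam=1.0, n_inner=50, step=0.05):
    """Approximately minimize f(y) + (lam/2) * ||y - x||^2 over the sphere.

    The ambient squared distance ||y - x||^2 stands in for the squared
    retraction distance of the paper (an assumption of this sketch).
    """
    y = x.copy()
    for _ in range(n_inner):
        g = f_grad(y) + lam * (y - x)  # gradient of the convexified sub-problem
        y = retract(y, -step * proj_tangent(y, g))
    return y

def minimize_on_sphere(f_grad, x0, n_outer=100, lam=1.0):
    """Outer loop: solve a sequence of convexified sub-problems."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(n_outer):
        x = prox_step(f_grad, x, lam=lam)
    return x

# Usage: a Fréchet-mean-style task on S^2, minimizing the sum of squared
# ambient distances to sample points (its gradient is given in closed form).
rng = np.random.default_rng(0)
pts = rng.normal(size=(20, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
grad = lambda y: 2.0 * (len(pts) * y - pts.sum(axis=0))
mean = minimize_on_sphere(grad, pts[0])
print(mean, np.linalg.norm(mean))  # a unit vector near the cluster of points

The design choice the sketch is meant to surface: each outer iteration only ever solves a sub-problem whose objective is the original one plus a quadratic anchor term, which is what makes the sub-problems convex (for large enough lam) even when the original objective is not.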
Related Subject Headings
- Artificial Intelligence & Image Processing
- 4611 Machine learning
- 1702 Cognitive Sciences
- 0806 Information Systems
- 0801 Artificial Intelligence and Image Processing