On the global convergence of randomized coordinate gradient descent for non-convex optimization

Journal Article
Chen, Z; Li, Y; Lu, J
January 4, 2021

In this work, we analyze the global convergence properties of coordinate gradient descent with randomly chosen coordinates and stepsizes for non-convex optimization problems. Under generic assumptions, we prove that the algorithm iterates almost surely escape strict saddle points of the objective function. As a result, the algorithm is guaranteed to converge to local minima whenever all saddle points are strict. Our proof is based on viewing the coordinate descent algorithm as a nonlinear random dynamical system and on a quantitative finite-block analysis of its linearization around saddle points.
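To make the setting concrete, here is a minimal sketch of randomized coordinate gradient descent, assuming coordinates are drawn uniformly at random and stepsizes are drawn from a finite candidate set; the function name `randomized_cgd`, the stepsize set, and the test objective are illustrative choices, not taken from the paper.

```python
import numpy as np

def randomized_cgd(grad, x0, stepsizes, n_iters=5000, rng=None):
    """Coordinate gradient descent with a random coordinate and a
    random stepsize at each iteration (illustrative sketch)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float).copy()
    d = x.size
    for _ in range(n_iters):
        i = rng.integers(d)           # pick one coordinate uniformly at random
        eta = rng.choice(stepsizes)   # pick a stepsize from the candidate set
        x[i] -= eta * grad(x)[i]      # update only the i-th coordinate
    return x

# Hypothetical non-convex example: f(x, y) = x^2 - y^2 + y^4 / 4,
# whose origin is a strict saddle point and whose minima are (0, ±sqrt(2)).
grad = lambda x: np.array([2.0 * x[0], -2.0 * x[1] + x[1] ** 3])
x_star = randomized_cgd(grad, x0=[0.3, 0.01], stepsizes=[0.05, 0.1])
print(x_star)  # iterates are expected to escape the saddle toward (0, ±sqrt(2))
```

In this toy run, the Hessian at the origin has a negative eigenvalue (a strict saddle), and the randomized iterates escape it and settle near a local minimum, illustrating the kind of behavior the paper analyzes.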
