
Proximal gradient algorithm with momentum and flexible parameter restart for nonconvex optimization

Publication, Journal Article
Zhou, Y; Wang, Z; Ji, K; Liang, Y; Tarokh, V
Published in: IJCAI International Joint Conference on Artificial Intelligence
January 1, 2020

Various types of parameter restart schemes have been proposed for proximal gradient algorithms with momentum to facilitate their convergence in convex optimization. However, under parameter restart, the convergence of the proximal gradient algorithm with momentum remains unclear in nonconvex optimization. In this paper, we propose a novel proximal gradient algorithm with momentum and parameter restart for solving nonconvex and nonsmooth problems. Our algorithm is designed to 1) allow for flexible parameter restart schemes that cover many existing ones; 2) achieve a global sub-linear convergence rate in nonconvex and nonsmooth optimization; and 3) guarantee convergence to a critical point, with various asymptotic convergence rates depending on the parameterization of the local geometry, in nonconvex and nonsmooth optimization. Numerical experiments demonstrate the convergence and effectiveness of the proposed algorithm.
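To illustrate the kind of method the abstract describes, the following is a minimal sketch of an accelerated proximal gradient iteration (FISTA-style momentum) with a function-value restart of the momentum parameter, applied to an L1-regularized least-squares problem. This is a generic textbook variant for illustration, not the paper's exact algorithm or its flexible restart schemes; the names `apg_with_restart`, `soft_threshold`, and the parameter `lam` are illustrative choices, and the restart rule used here (reset momentum when the objective increases) is just one of the schemes the paper's framework is meant to cover.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def apg_with_restart(A, b, lam, n_iters=500):
    """Accelerated proximal gradient for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    with a function-value restart: the momentum parameter is reset to 1
    whenever the objective increases."""
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the smooth gradient
    x = x_prev = np.zeros(A.shape[1])
    t = 1.0                               # momentum parameter
    f_prev = np.inf
    for _ in range(n_iters):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # momentum extrapolation
        grad = A.T @ (A @ y - b)
        x_prev, x = x, soft_threshold(y - grad / L, lam / L)  # proximal step
        t = t_next
        f = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
        if f > f_prev:                    # restart: kill the momentum
            t = 1.0
        f_prev = f
    return x
```

Without the restart branch this is plain FISTA; the restart guards against the oscillation that momentum can cause, which is also the behavior the paper's flexible restart schemes are designed to control in the nonconvex setting.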


Published In

IJCAI International Joint Conference on Artificial Intelligence

ISSN

1045-0823

Publication Date

January 1, 2020

Volume

2021-January

Start / End Page

1445 / 1451
 

Citation

APA: Zhou, Y., Wang, Z., Ji, K., Liang, Y., & Tarokh, V. (2020). Proximal gradient algorithm with momentum and flexible parameter restart for nonconvex optimization. IJCAI International Joint Conference on Artificial Intelligence, 2021-January, 1445–1451.

Chicago: Zhou, Y., Z. Wang, K. Ji, Y. Liang, and V. Tarokh. “Proximal gradient algorithm with momentum and flexible parameter restart for nonconvex optimization.” IJCAI International Joint Conference on Artificial Intelligence 2021-January (January 1, 2020): 1445–51.

ICMJE: Zhou Y, Wang Z, Ji K, Liang Y, Tarokh V. Proximal gradient algorithm with momentum and flexible parameter restart for nonconvex optimization. IJCAI International Joint Conference on Artificial Intelligence. 2020 Jan 1;2021-January:1445–51.

MLA: Zhou, Y., et al. “Proximal gradient algorithm with momentum and flexible parameter restart for nonconvex optimization.” IJCAI International Joint Conference on Artificial Intelligence, vol. 2021-January, Jan. 2020, pp. 1445–51.

NLM: Zhou Y, Wang Z, Ji K, Liang Y, Tarokh V. Proximal gradient algorithm with momentum and flexible parameter restart for nonconvex optimization. IJCAI International Joint Conference on Artificial Intelligence. 2020 Jan 1;2021-January:1445–1451.
