Accelerated Zeroth-Order and First-Order Momentum Methods from Mini to Minimax Optimization

Publication: Journal Article
Huang, F; Gao, S; Pei, J; Huang, H
Published in: Journal of Machine Learning Research
January 1, 2022

In this paper, we propose a class of accelerated zeroth-order and first-order momentum methods for both nonconvex mini-optimization and minimax optimization. Specifically, we propose a new accelerated zeroth-order momentum (Acc-ZOM) method for black-box mini-optimization, where only function values can be obtained. We prove that our Acc-ZOM method achieves a lower query complexity of $\tilde{O}(d^{3/4}\epsilon^{-3})$ for finding an $\epsilon$-stationary point, which improves the best known result by a factor of $O(d^{1/4})$, where $d$ denotes the variable dimension. In particular, our Acc-ZOM does not need the large batches required by existing zeroth-order stochastic algorithms. Meanwhile, we propose an accelerated zeroth-order momentum descent ascent (Acc-ZOMDA) method for black-box minimax optimization, where only function values can be obtained. Our Acc-ZOMDA obtains a low query complexity of $\tilde{O}((d_1+d_2)^{3/4}\kappa_y^{4.5}\epsilon^{-3})$ without requiring large batches for finding an $\epsilon$-stationary point, where $d_1$ and $d_2$ denote the variable dimensions and $\kappa_y$ is the condition number. Moreover, we propose an accelerated first-order momentum descent ascent (Acc-MDA) method for minimax optimization where explicit gradients are accessible. Our Acc-MDA achieves a low gradient complexity of $\tilde{O}(\kappa_y^{4.5}\epsilon^{-3})$ without requiring large batches for finding an $\epsilon$-stationary point. In particular, our Acc-MDA can obtain a lower gradient complexity of $\tilde{O}(\kappa_y^{2.5}\epsilon^{-3})$ with a batch size of $O(\kappa_y^4)$, which improves the best known result by a factor of $O(\kappa_y^{1/2})$. Extensive experimental results on black-box adversarial attacks on deep neural networks and poisoning attacks on logistic regression demonstrate the efficiency of our algorithms.
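As a rough illustration of the black-box setting the abstract describes, the sketch below combines a standard two-point zeroth-order gradient estimator with a STORM-style momentum (variance-reduction) recursion. This is a minimal sketch of the general technique only, not the paper's exact Acc-ZOM update; the step size `eta`, momentum weight `alpha`, and smoothing radius `mu` are placeholder assumptions, not the schedules the paper analyzes.

```python
import numpy as np

def zo_grad(f, x, u, mu=1e-3):
    """Two-point zeroth-order gradient estimate of f at x along a random
    direction u: only function values are queried, matching the black-box
    setting in the abstract."""
    return (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

def zo_momentum(f, x0, steps=1000, eta=0.01, alpha=0.1, seed=0):
    """Illustrative zeroth-order method with a STORM-style momentum
    recursion. A hedged sketch, NOT the paper's Acc-ZOM algorithm or its
    proven step-size schedule."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float)
    v = zo_grad(f, x, rng.standard_normal(x.shape))  # initial estimate
    for _ in range(steps):
        x_new = x - eta * v                  # descent step
        u = rng.standard_normal(x.shape)     # shared direction at both points
        g_new = zo_grad(f, x_new, u)
        g_old = zo_grad(f, x, u)
        # STORM-style recursion: v_t = g(x_t) + (1 - alpha) * (v_{t-1} - g(x_{t-1}))
        v = g_new + (1 - alpha) * (v - g_old)
        x = x_new
    return x
```

For instance, `zo_momentum(lambda z: float(z @ z), np.ones(10))` drives the iterate toward the minimizer of a simple quadratic using function evaluations alone, the regime in which the paper's query-complexity bounds apply.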

Published In

Journal of Machine Learning Research

EISSN

1533-7928

ISSN

1532-4435

Publication Date

January 1, 2022

Volume

23

Related Subject Headings

  • Artificial Intelligence & Image Processing
  • 4905 Statistics
  • 4611 Machine learning
  • 17 Psychology and Cognitive Sciences
  • 08 Information and Computing Sciences
 

Citation

APA: Huang, F., Gao, S., Pei, J., & Huang, H. (2022). Accelerated Zeroth-Order and First-Order Momentum Methods from Mini to Minimax Optimization. Journal of Machine Learning Research, 23.
Chicago: Huang, F., S. Gao, J. Pei, and H. Huang. “Accelerated Zeroth-Order and First-Order Momentum Methods from Mini to Minimax Optimization.” Journal of Machine Learning Research 23 (January 1, 2022).
ICMJE: Huang F, Gao S, Pei J, Huang H. Accelerated Zeroth-Order and First-Order Momentum Methods from Mini to Minimax Optimization. Journal of Machine Learning Research. 2022 Jan 1;23.
MLA: Huang, F., et al. “Accelerated Zeroth-Order and First-Order Momentum Methods from Mini to Minimax Optimization.” Journal of Machine Learning Research, vol. 23, Jan. 2022.
NLM: Huang F, Gao S, Pei J, Huang H. Accelerated Zeroth-Order and First-Order Momentum Methods from Mini to Minimax Optimization. Journal of Machine Learning Research. 2022 Jan 1;23.
