
SpiderBoost and momentum: Faster stochastic variance reduction algorithms

Publication, Journal Article
Wang, Z; Ji, K; Zhou, Y; Liang, Y; Tarokh, V
Published in: Advances in Neural Information Processing Systems
January 1, 2019

SARAH and SPIDER are two recently developed stochastic variance-reduced algorithms, and SPIDER has been shown to achieve a near-optimal first-order oracle complexity in smooth nonconvex optimization. However, SPIDER uses an accuracy-dependent stepsize that slows down convergence in practice, and it cannot handle objective functions that involve nonsmooth regularizers. In this paper, we propose SpiderBoost as an improved scheme, which allows a much larger, constant-level stepsize while maintaining the same near-optimal oracle complexity, and which can be extended with a proximal mapping to handle composite optimization (which is nonsmooth and nonconvex) with provable convergence guarantees. In particular, we show that proximal SpiderBoost achieves an oracle complexity of $O(\min\{n^{1/2}\epsilon^{-2}, \epsilon^{-3}\})$ in composite nonconvex optimization, improving the state-of-the-art result by a factor of $O(\min\{n^{1/6}, \epsilon^{-1/3}\})$. We further develop a novel momentum scheme to accelerate SpiderBoost for composite optimization, which achieves the near-optimal oracle complexity in theory and substantial improvements in experiments.
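To make the scheme concrete, below is a minimal Python sketch of the SpiderBoost-style update described in the abstract: a SPIDER/SARAH recursive variance-reduced gradient estimator, refreshed with a full batch every q iterations and applied with a constant stepsize. The least-squares objective, stepsize, batch size, and epoch length are illustrative assumptions, not the paper's experimental setup.

# Minimal sketch of a SpiderBoost-style loop on a toy least-squares problem.
# All problem and hyperparameter choices below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 20
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad(x, idx):
    # Mini-batch gradient of f(x) = (1/2n) * ||Ax - b||^2 over samples idx.
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / len(idx)

def spider_boost(x0, eta=0.05, q=32, batch=32, iters=500):
    x_prev = x = x0.copy()
    v = grad(x, np.arange(n))              # initialize estimator with a full gradient
    for k in range(iters):
        if k > 0 and k % q == 0:
            v = grad(x, np.arange(n))      # refresh with a full batch every q steps
        elif k > 0:
            idx = rng.choice(n, size=batch, replace=False)
            # SPIDER/SARAH recursion: correct the previous estimate with a
            # mini-batch gradient difference, which keeps the variance small.
            v = grad(x, idx) - grad(x_prev, idx) + v
        x_prev, x = x, x - eta * v         # constant stepsize, independent of accuracy
    return x

x_hat = spider_boost(np.zeros(d))
print("final objective:", 0.5 * np.mean((A @ x_hat - b) ** 2))

For the composite problems in the abstract, the gradient step x - eta*v would be followed by the proximal mapping of the nonsmooth regularizer, and the momentum variant additionally mixes the proximal output with the previous iterate; both are omitted here for brevity.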


Published In

Advances in Neural Information Processing Systems

ISSN

1049-5258

Publication Date

January 1, 2019

Volume

32

Related Subject Headings

  • 4611 Machine learning
  • 1702 Cognitive Sciences
  • 1701 Psychology
 

Citation

APA: Wang, Z., Ji, K., Zhou, Y., Liang, Y., & Tarokh, V. (2019). SpiderBoost and momentum: Faster stochastic variance reduction algorithms. Advances in Neural Information Processing Systems, 32.
Chicago: Wang, Z., K. Ji, Y. Zhou, Y. Liang, and V. Tarokh. “SpiderBoost and momentum: Faster stochastic variance reduction algorithms.” Advances in Neural Information Processing Systems 32 (January 1, 2019).
ICMJE: Wang Z, Ji K, Zhou Y, Liang Y, Tarokh V. SpiderBoost and momentum: Faster stochastic variance reduction algorithms. Advances in Neural Information Processing Systems. 2019 Jan 1;32.
MLA: Wang, Z., et al. “SpiderBoost and momentum: Faster stochastic variance reduction algorithms.” Advances in Neural Information Processing Systems, vol. 32, Jan. 2019.
NLM: Wang Z, Ji K, Zhou Y, Liang Y, Tarokh V. SpiderBoost and momentum: Faster stochastic variance reduction algorithms. Advances in Neural Information Processing Systems. 2019 Jan 1;32.
