Towards Faster Training Algorithms Exploiting Bandit Sampling From Convex to Strongly Convex Conditions

Publication: Journal Article
Zhou, Y; Huang, K; Cheng, C; Wang, X; Hussain, A; Liu, X
Published in: IEEE Transactions on Emerging Topics in Computational Intelligence
April 1, 2023

The training of deep learning and pattern recognition models normally relies on convex and strongly convex optimization algorithms such as AdaBelief and SAdam, which process many 'uninformative' samples that could safely be ignored, thus incurring extra computation. To address this open problem, we propose a bandit sampling method that makes these algorithms focus on 'informative' samples during training. Our contribution is twofold: first, we propose a convex optimization algorithm with bandit sampling, termed AdaBeliefBS, and prove that it converges faster than its original version; second, we prove that bandit sampling also works well for strongly convex algorithms, and propose a generalized SAdam, called SAdamBS, that converges faster than SAdam. Finally, we conduct a series of experiments on various benchmark datasets to verify the fast convergence of our proposed algorithms.
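To illustrate the general idea of bandit sampling described in the abstract — focusing updates on 'informative' (high-loss) samples rather than treating all samples uniformly — here is a minimal EXP3-style sketch in Python. This is an assumption-laden illustration, not the paper's AdaBeliefBS or SAdamBS algorithms; the class name, parameters (`eta`, `gamma`), and reward definition are all hypothetical choices for demonstration.

```python
import math
import random


class BanditSampler:
    """EXP3-style sampler that gradually favors high-reward ('informative')
    training examples. Illustrative sketch only; the update rule and
    hyperparameters are assumptions, not the algorithms from the paper."""

    def __init__(self, n, eta=0.1, gamma=0.1, seed=0):
        self.n = n                  # number of training examples (arms)
        self.eta = eta              # step size for weight updates
        self.gamma = gamma          # uniform-exploration mixing coefficient
        self.weights = [1.0] * n    # one weight per example
        self.rng = random.Random(seed)

    def probabilities(self):
        # Mix the normalized weight distribution with uniform exploration
        # so every example keeps a nonzero chance of being drawn.
        total = sum(self.weights)
        return [(1 - self.gamma) * w / total + self.gamma / self.n
                for w in self.weights]

    def sample(self):
        # Draw one example index according to the current distribution.
        probs = self.probabilities()
        r, cum = self.rng.random(), 0.0
        for i, p in enumerate(probs):
            cum += p
            if r < cum:
                return i, p
        return self.n - 1, probs[-1]

    def update(self, i, p, reward):
        # Importance-weighted exponential update: dividing the reward by the
        # sampling probability keeps the reward estimate unbiased.
        self.weights[i] *= math.exp(self.eta * reward / (p * self.n))
```

In a training loop, `reward` would be derived from the sampled example's loss (higher loss, higher reward), so the sampler increasingly draws the examples that still carry gradient signal, which is the intuition behind skipping 'uninformative' samples.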


Published In

IEEE Transactions on Emerging Topics in Computational Intelligence

DOI

10.1109/TETCI.2022.3171797

EISSN

2471-285X

Publication Date

April 1, 2023

Volume

7

Issue

2

Start / End Page

565 / 577

Related Subject Headings

  • 4611 Machine learning
  • 4603 Computer vision and multimedia computation
 

Citation

APA: Zhou, Y., Huang, K., Cheng, C., Wang, X., Hussain, A., & Liu, X. (2023). Towards Faster Training Algorithms Exploiting Bandit Sampling From Convex to Strongly Convex Conditions. IEEE Transactions on Emerging Topics in Computational Intelligence, 7(2), 565–577. https://doi.org/10.1109/TETCI.2022.3171797

Chicago: Zhou, Y., K. Huang, C. Cheng, X. Wang, A. Hussain, and X. Liu. “Towards Faster Training Algorithms Exploiting Bandit Sampling From Convex to Strongly Convex Conditions.” IEEE Transactions on Emerging Topics in Computational Intelligence 7, no. 2 (April 1, 2023): 565–77. https://doi.org/10.1109/TETCI.2022.3171797.

ICMJE: Zhou Y, Huang K, Cheng C, Wang X, Hussain A, Liu X. Towards Faster Training Algorithms Exploiting Bandit Sampling From Convex to Strongly Convex Conditions. IEEE Transactions on Emerging Topics in Computational Intelligence. 2023 Apr 1;7(2):565–77.

MLA: Zhou, Y., et al. “Towards Faster Training Algorithms Exploiting Bandit Sampling From Convex to Strongly Convex Conditions.” IEEE Transactions on Emerging Topics in Computational Intelligence, vol. 7, no. 2, Apr. 2023, pp. 565–77. Scopus, doi:10.1109/TETCI.2022.3171797.

NLM: Zhou Y, Huang K, Cheng C, Wang X, Hussain A, Liu X. Towards Faster Training Algorithms Exploiting Bandit Sampling From Convex to Strongly Convex Conditions. IEEE Transactions on Emerging Topics in Computational Intelligence. 2023 Apr 1;7(2):565–577.
