A convergence analysis for a class of practical variance-reduction stochastic gradient MCMC

Publication: Journal Article
Chen, C; Wang, W; Zhang, Y; Su, Q; Carin, L
Published in: Science China Information Sciences
January 1, 2019

Stochastic gradient Markov chain Monte Carlo (SG-MCMC) has been developed as a flexible family of scalable Bayesian sampling algorithms. However, there has been little theoretical analysis of the impact of minibatch size on the algorithm's convergence rate. In this paper, we prove that at the beginning of an SG-MCMC algorithm, i.e., under a limited computational budget/time, a larger minibatch size leads to a faster decrease of the mean squared error bound. This is due to the prominent noise in small minibatches when calculating stochastic gradients, motivating the necessity of variance reduction in SG-MCMC for practical use. By borrowing ideas from stochastic optimization, we propose a simple and practical variance-reduction technique for SG-MCMC that is efficient in both computation and storage. More importantly, we develop the theory to prove that our algorithm induces a faster convergence rate than standard SG-MCMC. A number of large-scale experiments, ranging from Bayesian learning of logistic regression to deep neural networks, validate the theory and demonstrate the superiority of the proposed variance-reduction SG-MCMC framework.
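The variance-reduction idea borrowed from stochastic optimization can be illustrated with an SVRG-style control variate inside stochastic gradient Langevin dynamics (SGLD). The sketch below is an illustration, not the paper's exact algorithm: the toy target (a Gaussian posterior over a mean parameter), the step size, and the snapshot interval are all assumptions chosen for readability. A full-data gradient is cached at a periodic "anchor" point, and each minibatch gradient is corrected by the anchor's minibatch gradient, which shrinks the stochastic-gradient noise without touching every datum per step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target (an assumption for illustration): y_i ~ N(theta, 1),
# with prior theta ~ N(0, 10); the posterior mean is close to data.mean().
N = 1000
data = rng.normal(2.0, 1.0, size=N)

def grad_i(theta, x):
    """Per-datum gradient of the log-posterior: the log-likelihood term
    plus a 1/N share of the log-prior gradient."""
    return (x - theta) - theta / (10.0 * N)

def svrg_sgld(n_iters=2000, batch=10, eps=1e-4, snapshot_every=100):
    theta = 0.0
    samples = []
    for t in range(n_iters):
        if t % snapshot_every == 0:
            # Periodic snapshot: cache the full-data gradient at the anchor.
            anchor, g_anchor = theta, grad_i(theta, data).sum()
        idx = rng.choice(N, size=batch, replace=False)
        # Control-variate estimator: minibatch correction around the anchor.
        g = (N / batch) * (grad_i(theta, data[idx]).sum()
                           - grad_i(anchor, data[idx]).sum()) + g_anchor
        # Langevin update: half-step along the gradient plus injected noise.
        theta += 0.5 * eps * g + np.sqrt(eps) * rng.normal()
        samples.append(theta)
    return np.array(samples)

samples = svrg_sgld()
print(samples[1000:].mean())  # should settle near the posterior mean
```

Because `grad_i(theta, idx) - grad_i(anchor, idx)` is small whenever `theta` stays near the anchor, the minibatch term contributes far less variance than the raw minibatch gradient would, at the cost of one full-gradient pass per snapshot.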


Published In

Science China Information Sciences

DOI

10.1007/s11432-018-9656-y

EISSN

1869-1919

ISSN

1674-733X

Publication Date

January 1, 2019

Volume

62

Issue

1

Related Subject Headings

  • Software Engineering
  • 4009 Electronics, sensors and digital hardware
  • 4007 Control engineering, mechatronics and robotics
  • 4006 Communications engineering
  • 0899 Other Information and Computing Sciences
  • 0806 Information Systems
  • 0804 Data Format
 

Citation

APA
Chen, C., Wang, W., Zhang, Y., Su, Q., & Carin, L. (2019). A convergence analysis for a class of practical variance-reduction stochastic gradient MCMC. Science China Information Sciences, 62(1). https://doi.org/10.1007/s11432-018-9656-y

Chicago
Chen, C., W. Wang, Y. Zhang, Q. Su, and L. Carin. "A convergence analysis for a class of practical variance-reduction stochastic gradient MCMC." Science China Information Sciences 62, no. 1 (January 1, 2019). https://doi.org/10.1007/s11432-018-9656-y.

ICMJE
Chen C, Wang W, Zhang Y, Su Q, Carin L. A convergence analysis for a class of practical variance-reduction stochastic gradient MCMC. Science China Information Sciences. 2019 Jan 1;62(1).

MLA
Chen, C., et al. "A convergence analysis for a class of practical variance-reduction stochastic gradient MCMC." Science China Information Sciences, vol. 62, no. 1, Jan. 2019. Scopus, doi:10.1007/s11432-018-9656-y.

NLM
Chen C, Wang W, Zhang Y, Su Q, Carin L. A convergence analysis for a class of practical variance-reduction stochastic gradient MCMC. Science China Information Sciences. 2019 Jan 1;62(1).