
High-Order stochastic gradient thermostats for Bayesian learning of deep models

Publication, Conference
Li, C; Chen, C; Fan, K; Carin, L
Published in: 30th AAAI Conference on Artificial Intelligence, AAAI 2016
January 1, 2016

Learning in deep models using Bayesian methods has generated significant attention recently, largely because modern Bayesian methods can yield scalable learning and inference while maintaining a measure of uncertainty in the model parameters. Stochastic gradient MCMC (SG-MCMC) algorithms are a family of diffusion-based sampling methods for large-scale Bayesian learning. Within SG-MCMC, multivariate stochastic gradient thermostats (mSGNHT) augment each parameter of interest with a momentum and a thermostat variable so that the stationary distribution of the dynamics matches the target posterior. As the number of variables in a continuous-time diffusion increases, the numerical approximation error becomes a practical bottleneck, so a better numerical integrator is desirable. To this end, we propose an efficient symmetric splitting integrator for mSGNHT in place of the traditional Euler integrator. We demonstrate that the proposed scheme is more accurate, more robust, and converges faster, properties that are desirable for Bayesian deep learning. Extensive experiments on two canonical models and their deep extensions demonstrate that the proposed scheme improves general Bayesian posterior sampling, particularly for deep models.
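For context, the update described in the abstract can be sketched as follows. This is a minimal NumPy illustration of one symmetric-splitting (A-B-O-B-A) step for mSGNHT, assuming a user-supplied minibatch stochastic gradient grad_U_hat of the negative log-posterior and a noise scale D; it reconstructs the splitting pattern the abstract refers to and is not the authors' released implementation.

```python
import numpy as np

def msgnht_sss_step(theta, p, xi, grad_U_hat, h, D=1.0, rng=np.random):
    """One symmetric-splitting (A-B-O-B-A) step for mSGNHT (illustrative sketch).

    theta, p, xi : parameter, momentum, and per-dimension thermostat vectors
    grad_U_hat   : stochastic gradient of the negative log-posterior at theta
    h            : step size;  D : scale of the injected noise
    """
    # A (half step): move theta along the momentum, update the thermostat
    theta = theta + 0.5 * h * p
    xi = xi + 0.5 * h * (p * p - 1.0)

    # B (half step): thermostat friction on the momentum, solved exactly
    p = np.exp(-0.5 * h * xi) * p

    # O (full step): stochastic gradient force plus injected Gaussian noise
    p = p - h * grad_U_hat(theta) + np.sqrt(2.0 * D * h) * rng.standard_normal(p.shape)

    # B, A (half steps): mirror the first sub-steps to keep the scheme symmetric
    p = np.exp(-0.5 * h * xi) * p
    theta = theta + 0.5 * h * p
    xi = xi + 0.5 * h * (p * p - 1.0)
    return theta, p, xi


if __name__ == "__main__":
    # Toy check: sample a 10-dimensional standard Gaussian, using the exact
    # gradient of U(theta) = ||theta||^2 / 2 in place of a minibatch gradient.
    rng = np.random.default_rng(0)
    theta, p, xi = np.zeros(10), rng.standard_normal(10), np.ones(10)
    samples = []
    for step in range(20000):
        theta, p, xi = msgnht_sss_step(theta, p, xi, lambda t: t, h=0.01, rng=rng)
        if step > 5000:
            samples.append(theta.copy())
    print("sample variance (should be close to 1):", np.var(np.stack(samples)))
```

A plain Euler integrator would apply all three forces in a single unsymmetric sweep per step; the symmetric composition above is what the paper credits with the higher-order accuracy and faster convergence.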


Published In

30th AAAI Conference on Artificial Intelligence, AAAI 2016

ISBN

9781577357605

Publication Date

January 1, 2016

Start / End Page

1795 / 1801
 

Citation

APA: Li, C., Chen, C., Fan, K., & Carin, L. (2016). High-Order stochastic gradient thermostats for Bayesian learning of deep models. In 30th AAAI Conference on Artificial Intelligence, AAAI 2016 (pp. 1795–1801).
Chicago: Li, C., C. Chen, K. Fan, and L. Carin. “High-Order stochastic gradient thermostats for Bayesian learning of deep models.” In 30th AAAI Conference on Artificial Intelligence, AAAI 2016, 1795–1801, 2016.
ICMJE: Li C, Chen C, Fan K, Carin L. High-Order stochastic gradient thermostats for Bayesian learning of deep models. In: 30th AAAI Conference on Artificial Intelligence, AAAI 2016. 2016. p. 1795–801.
MLA: Li, C., et al. “High-Order stochastic gradient thermostats for Bayesian learning of deep models.” 30th AAAI Conference on Artificial Intelligence, AAAI 2016, 2016, pp. 1795–801.
NLM: Li C, Chen C, Fan K, Carin L. High-Order stochastic gradient thermostats for Bayesian learning of deep models. 30th AAAI Conference on Artificial Intelligence, AAAI 2016. 2016. p. 1795–1801.
