Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling

Publication, Conference
Zou, D; Xu, P; Gu, Q
Published in: Proceedings of Machine Learning Research
January 1, 2021

We provide a new convergence analysis of stochastic gradient Langevin dynamics (SGLD) for sampling from a class of distributions that can be non-log-concave. At the core of our approach is a novel conductance analysis of SGLD using an auxiliary time-reversible Markov chain. Under certain conditions on the target distribution, we prove that Õ(d^4 ε^{-2}) stochastic gradient evaluations suffice to guarantee ε-sampling error in terms of the total variation distance, where d is the problem dimension. This improves existing results on the convergence rate of SGLD [Raginsky et al., 2017, Xu et al., 2018]. We further show that, provided an additional Hessian Lipschitz condition on the log-density function, SGLD is guaranteed to achieve ε-sampling error within Õ(d^{15/4} ε^{-3/2}) stochastic gradient evaluations. Our proof technique provides a new way to study the convergence of Langevin-based algorithms, and sheds some light on the design of fast stochastic gradient-based sampling algorithms.
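For orientation, the SGLD iteration analyzed in the paper can be sketched in a few lines. This is an illustrative example, not the authors' code: it samples from a 1-D two-component Gaussian mixture (a simple non-log-concave target), and the stochastic gradient is mimicked by adding Gaussian noise to the exact gradient of the negative log-density; the step size and noise scale are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative non-log-concave target: a 1-D Gaussian mixture
#   p(x) = 0.5 * N(x; -2, 1) + 0.5 * N(x; +2, 1),
# so U(x) = -log p(x) is non-convex.
def grad_U(x):
    # Exact gradient of the negative log-density, derived analytically:
    # grad U = (w1*(x+2) + w2*(x-2)) / (w1 + w2), with unnormalized weights w_i.
    w1 = np.exp(-0.5 * (x + 2) ** 2)
    w2 = np.exp(-0.5 * (x - 2) ** 2)
    return (w1 * (x + 2) + w2 * (x - 2)) / (w1 + w2)

def sgld(n_steps=20000, eta=0.05, grad_noise=0.3):
    """SGLD: x <- x - eta * g(x) + sqrt(2*eta) * xi, xi ~ N(0, 1),
    where g(x) is a noisy (mini-batch-like) estimate of grad U(x)."""
    x = 0.0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        g = grad_U(x) + grad_noise * rng.standard_normal()   # stochastic gradient
        x = x - eta * g + np.sqrt(2 * eta) * rng.standard_normal()  # Langevin step
        samples[k] = x
    return samples

samples = sgld()
# With enough steps the chain hops between the modes near -2 and +2,
# which is exactly the regime (multi-modal, non-log-concave) the analysis covers.
```

The point of the sketch is the update rule itself: gradient descent on U plus injected Gaussian noise of variance 2η, with the gradient only available up to stochastic error; the paper's contribution is bounding how many such noisy gradient evaluations are needed for the chain's law to be ε-close to the target in total variation.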


Published In

Proceedings of Machine Learning Research

EISSN

2640-3498

Publication Date

January 1, 2021

Volume

161

Start / End Page

1152 / 1162
 

Citation

APA: Zou, D., Xu, P., & Gu, Q. (2021). Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling. In Proceedings of Machine Learning Research (Vol. 161, pp. 1152–1162).
Chicago: Zou, D., P. Xu, and Q. Gu. “Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling.” In Proceedings of Machine Learning Research, 161:1152–62, 2021.
ICMJE: Zou D, Xu P, Gu Q. Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling. In: Proceedings of Machine Learning Research. 2021. p. 1152–62.
MLA: Zou, D., et al. “Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling.” Proceedings of Machine Learning Research, vol. 161, 2021, pp. 1152–62.
NLM: Zou D, Xu P, Gu Q. Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling. Proceedings of Machine Learning Research. 2021. p. 1152–1162.
