Beyond log-concavity: Provable guarantees for sampling multi-modal distributions using simulated tempering Langevin Monte Carlo

Publication, Conference
Ge, R; Lee, H; Risteski, A
Published in: Advances in Neural Information Processing Systems
January 1, 2018

A key task in Bayesian machine learning is sampling from distributions that are only specified up to a partition function (i.e., constant of proportionality). One prevalent example of this is sampling posteriors in parametric distributions, such as latent-variable generative models. However, sampling (even very approximately) can be #P-hard. Classical results on sampling (going back to [BÉ85]) focus on log-concave distributions and show that a natural Markov chain called Langevin diffusion mixes in polynomial time. However, all log-concave distributions are uni-modal, while in practice the distribution of interest very commonly has multiple modes; in this case, Langevin diffusion suffers from torpid mixing. We address this problem by combining Langevin diffusion with simulated tempering. The result is a Markov chain that mixes more rapidly by transitioning between different temperatures of the distribution. We analyze this Markov chain for a mixture of (strongly) log-concave distributions of the same shape. In particular, our technique applies to the canonical multi-modal distribution: a mixture of Gaussians (of equal variance). Our algorithm efficiently samples from these distributions given only access to the gradient of the log-pdf. To the best of our knowledge, this is the first result that proves fast mixing for multi-modal distributions in this setting. For the analysis, we introduce novel techniques for proving spectral gaps based on decomposing the action of the generator of the diffusion. Previous approaches rely on decomposing the state space as a partition of sets, while our approach can be thought of as decomposing the stationary measure as a mixture of distributions (a “soft partition”). Additional materials for the paper can be found at http://tiny.cc/glr17. Note that the proof and results have been improved and generalized from the precursor at http://www.arxiv.org/abs/1710.02736; see Section for a comparison.
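For intuition, the chain described in the abstract interleaves two kinds of moves: a discretized Langevin step targeting the current tempered density p(x)^beta, and a Metropolis-Hastings move that proposes changing beta along a fixed ladder. Below is a minimal Python sketch of that scheme under stated assumptions, not the paper's implementation: the names (simulated_tempering_langevin, grad_log_pdf, the ladder (0.25, 0.5, 1.0)) are illustrative, and the swap rule shown omits the per-level partition-function weights a practical simulated tempering chain also needs.

```python
import numpy as np

def simulated_tempering_langevin(grad_log_pdf, log_pdf, x0, betas,
                                 step_size=1e-2, n_steps=20_000, rng=None):
    """Sketch: Langevin steps interleaved with temperature swaps.

    `log_pdf` / `grad_log_pdf` are the *unnormalized* log-density and its
    gradient; `betas` is an ascending ladder of inverse temperatures whose
    last entry (beta = 1) is the target distribution.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    k = len(betas) - 1                      # current rung on the ladder
    samples = []
    for _ in range(n_steps):
        beta = betas[k]
        # Discretized Langevin diffusion targeting p(x)^beta:
        #   x <- x + eta * beta * grad log p(x) + sqrt(2 * eta) * N(0, I)
        x = (x + step_size * beta * grad_log_pdf(x)
             + np.sqrt(2.0 * step_size) * rng.standard_normal(np.shape(x)))
        # Metropolis-Hastings proposal to move one rung up or down.
        # NOTE (assumption): real simulated tempering also weights each
        # level by an estimate of its partition function, omitted here.
        k_new = k + rng.choice((-1, 1))
        if 0 <= k_new < len(betas):
            log_accept = (betas[k_new] - betas[k]) * log_pdf(x)
            if np.log(rng.random()) < log_accept:
                k = k_new
        if k == len(betas) - 1:             # record target-temperature samples
            samples.append(np.copy(x))
    return np.asarray(samples)

# Hypothetical demo target: the paper's canonical multi-modal case,
# an equal-variance mixture of two Gaussians (here in one dimension).
mus = np.array([-5.0, 5.0])

def log_pdf(x):
    logs = -0.5 * (x - mus) ** 2            # per-component log densities
    return np.logaddexp(logs[0], logs[1])

def grad_log_pdf(x):
    logs = -0.5 * (x - mus) ** 2
    w = np.exp(logs - np.logaddexp(logs[0], logs[1]))  # component weights
    return np.sum(w * (mus - x))

samples = simulated_tempering_langevin(grad_log_pdf, log_pdf, x0=0.0,
                                       betas=(0.25, 0.5, 1.0))
```

The point of the temperature moves is that at small beta the tempered density flattens, so the Langevin step can cross between modes; the swaps carry that exploration back down to beta = 1, which is what lets the combined chain escape the torpid mixing of plain Langevin diffusion described above.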

Published In

Advances in Neural Information Processing Systems

ISSN

1049-5258

Publication Date

January 1, 2018

Volume

2018-December

Start / End Page

7847 / 7856

Related Subject Headings

  • 4611 Machine learning
  • 1702 Cognitive Sciences
  • 1701 Psychology
 

Citation

APA
Ge, R., Lee, H., & Risteski, A. (2018). Beyond log-concavity: Provable guarantees for sampling multi-modal distributions using simulated tempering Langevin Monte Carlo. In Advances in Neural Information Processing Systems (Vol. 2018-December, pp. 7847–7856).

Chicago
Ge, R., H. Lee, and A. Risteski. “Beyond log-concavity: Provable guarantees for sampling multi-modal distributions using simulated tempering Langevin Monte Carlo.” In Advances in Neural Information Processing Systems, 2018-December:7847–56, 2018.

ICMJE
Ge R, Lee H, Risteski A. Beyond log-concavity: Provable guarantees for sampling multi-modal distributions using simulated tempering Langevin Monte Carlo. In: Advances in Neural Information Processing Systems. 2018. p. 7847–56.

MLA
Ge, R., et al. “Beyond log-concavity: Provable guarantees for sampling multi-modal distributions using simulated tempering Langevin Monte Carlo.” Advances in Neural Information Processing Systems, vol. 2018-December, 2018, pp. 7847–56.

NLM
Ge R, Lee H, Risteski A. Beyond log-concavity: Provable guarantees for sampling multi-modal distributions using simulated tempering Langevin Monte Carlo. Advances in Neural Information Processing Systems. 2018. p. 7847–7856.
