
Parallelizing MCMC with random partition trees

Publication, Conference
Wang, X; Guo, F; Heller, KA; Dunson, DB
Published in: Advances in Neural Information Processing Systems
January 1, 2015

The modern scale of data has brought new challenges to Bayesian inference. In particular, conventional MCMC algorithms are computationally very expensive for large data sets. A promising approach to this problem is embarrassingly parallel MCMC (EP-MCMC), which first partitions the data into multiple subsets and runs independent sampling algorithms on each subset. The subset posterior draws are then aggregated via some combining rule to obtain the final approximation. Existing EP-MCMC algorithms are limited by approximation accuracy and difficulty in resampling. In this article, we propose a new EP-MCMC algorithm, PART, that solves these problems. The new algorithm applies random partition trees to combine the subset posterior draws, which is distribution-free, easy to resample from, and able to adapt to multiple scales. We provide theoretical justification and extensive experiments illustrating empirical performance.
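The EP-MCMC workflow described in the abstract can be illustrated with a minimal sketch. The example below partitions simulated data, runs an independent random-walk Metropolis sampler on each subset, and combines the subset posterior draws. For simplicity it uses a Gaussian (precision-weighted) product as the combining rule rather than the paper's partition-tree aggregation, which is the novel contribution of PART; all variable names and the toy model (Gaussian mean with known unit variance, flat prior) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n observations from N(mu_true, 1).
mu_true = 2.0
n, m = 10_000, 4                      # m = number of subsets / machines
y = rng.normal(mu_true, 1.0, size=n)
subsets = np.array_split(y, m)

def metropolis(data, n_iter=5000, step=0.05):
    """Random-walk Metropolis targeting the subset posterior of the mean
    (flat prior, known unit variance)."""
    theta = data.mean()
    draws = np.empty(n_iter)
    for t in range(n_iter):
        prop = theta + step * rng.normal()
        # Log-likelihood ratio for i.i.d. N(theta, 1) data.
        log_ratio = -0.5 * (((data - prop) ** 2).sum()
                            - ((data - theta) ** 2).sum())
        if np.log(rng.uniform()) < log_ratio:
            theta = prop
        draws[t] = theta
    return draws

# Step 1: independent subset samplers (run in parallel in practice).
subset_draws = [metropolis(s) for s in subsets]

# Step 2: aggregate subset posteriors. Here a parametric Gaussian product
# (precision-weighted average) stands in for PART's partition-tree rule.
means = np.array([d.mean() for d in subset_draws])
precisions = np.array([1.0 / d.var() for d in subset_draws])
combined_mean = (precisions * means).sum() / precisions.sum()
combined_var = 1.0 / precisions.sum()

print(combined_mean, combined_var)
```

With all 10,000 observations, the combined posterior concentrates tightly around the true mean; the Gaussian combining rule is exact in this conjugate toy model, whereas PART's tree-based rule is designed to handle non-Gaussian, multimodal subset posteriors.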


Published In

Advances in Neural Information Processing Systems

ISSN

1049-5258

Publication Date

January 1, 2015

Volume

2015-January

Start / End Page

451 / 459

Related Subject Headings

  • 4611 Machine learning
  • 1702 Cognitive Sciences
  • 1701 Psychology
 

Citation

APA: Wang, X., Guo, F., Heller, K. A., & Dunson, D. B. (2015). Parallelizing MCMC with random partition trees. In Advances in Neural Information Processing Systems (Vol. 2015-January, pp. 451–459).
Chicago: Wang, X., F. Guo, K. A. Heller, and D. B. Dunson. “Parallelizing MCMC with random partition trees.” In Advances in Neural Information Processing Systems, 2015-January:451–59, 2015.
ICMJE: Wang X, Guo F, Heller KA, Dunson DB. Parallelizing MCMC with random partition trees. In: Advances in Neural Information Processing Systems. 2015. p. 451–9.
MLA: Wang, X., et al. “Parallelizing MCMC with random partition trees.” Advances in Neural Information Processing Systems, vol. 2015-January, 2015, pp. 451–59.
NLM: Wang X, Guo F, Heller KA, Dunson DB. Parallelizing MCMC with random partition trees. Advances in Neural Information Processing Systems. 2015. p. 451–459.
