Parallel and Distributed MCMC via Shepherding Distributions

Arkabandhu Chowdhury, Christopher Jermaine
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1819-1827, 2018.

Abstract

In this paper, we present a general algorithmic framework for developing easily parallelizable/distributable Markov Chain Monte Carlo (MCMC) algorithms. Our framework relies on the introduction of an auxiliary distribution called a 'shepherding distribution' (SD) that is used to control several MCMC chains that run in parallel. The SD is an introduced prior on one or more key parameters (or hyperparameters) of the target distribution. The shepherded chains then collectively explore the space of samples, communicating via the shepherding distribution, to reach high-likelihood regions faster. The method of SDs is simple, and it is often easy to develop a shepherded sampler for a particular problem. Other advantages include wide applicability: the method can easily be used to draw samples from discrete distributions or from distributions on the simplex. Further, the method is asymptotically correct, since it trivially maintains detailed balance.
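The abstract describes the mechanics only at a high level: several chains run in parallel, and an introduced prior over a shared (hyper)parameter couples them. The Python sketch below illustrates that coupling under stated assumptions and is not the paper's algorithm: the tempered per-chain target f(x)^theta, the theta^4 shepherding prior, and the bimodal example density are all invented here for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    def log_f(x):
        # Log of an (unnormalized) bimodal example target:
        # an equal mixture of N(-3, 1) and N(3, 1).
        return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

    def log_shepherd(theta):
        # Log of a hypothetical shepherding prior on a shared
        # temperature theta in (0, 1], favoring theta near 1.
        return 4.0 * np.log(theta)

    J, n_iter, step = 8, 5000, 1.0   # chains, sweeps, proposal scale
    x = rng.normal(size=J)           # one state per parallel chain
    theta = 0.5                      # shared, shepherded hyperparameter

    # Joint being sampled: s(theta) * prod_j f(x_j)^theta.
    for _ in range(n_iter):
        # (1) One Metropolis step per chain, targeting f(x)^theta.
        # These J updates are independent given theta, so in a
        # distributed setting they run in parallel with no communication.
        prop = x + step * rng.normal(size=J)
        accept = np.log(rng.random(J)) < theta * (log_f(prop) - log_f(x))
        x = np.where(accept, prop, x)

        # (2) One Metropolis step on theta given all chain states; this
        # shared update is the only point where the chains communicate.
        theta_prop = theta + 0.05 * rng.normal()
        if 0.0 < theta_prop <= 1.0:
            log_ratio = ((theta_prop - theta) * log_f(x).sum()
                         + log_shepherd(theta_prop) - log_shepherd(theta))
            if np.log(rng.random()) < log_ratio:
                theta = theta_prop

Each sweep alternates J embarrassingly parallel within-chain updates with a single shared update of theta; that one exchange is the only communication, which is what makes such a scheme cheap to distribute. Note that under this toy joint the marginal of each x_j is a tempered version of f rather than f itself; the construction in the paper is what yields asymptotically correct samples from the actual target.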

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-chowdhury18a,
  title     = {Parallel and Distributed MCMC via Shepherding Distributions},
  author    = {Chowdhury, Arkabandhu and Jermaine, Christopher},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1819--1827},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/chowdhury18a/chowdhury18a.pdf},
  url       = {https://proceedings.mlr.press/v84/chowdhury18a.html},
  abstract  = {In this paper, we present a general algorithmic framework for developing easily parallelizable/distributable Markov Chain Monte Carlo (MCMC) algorithms. Our framework relies on the introduction of an auxiliary distribution called a 'shepherding distribution' (SD) that is used to control several MCMC chains that run in parallel. The SD is an introduced prior on one or more key parameters (or hyperparameters) of the target distribution. The shepherded chains then collectively explore the space of samples, communicating via the shepherding distribution, to reach high-likelihood regions faster. The method of SDs is simple, and it is often easy to develop a shepherded sampler for a particular problem. Other advantages include wide applicability: the method can easily be used to draw samples from discrete distributions or from distributions on the simplex. Further, the method is asymptotically correct, since it trivially maintains detailed balance.}
}
Endnote
%0 Conference Paper
%T Parallel and Distributed MCMC via Shepherding Distributions
%A Arkabandhu Chowdhury
%A Christopher Jermaine
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-chowdhury18a
%I PMLR
%P 1819--1827
%U https://proceedings.mlr.press/v84/chowdhury18a.html
%V 84
%X In this paper, we present a general algorithmic framework for developing easily parallelizable/distributable Markov Chain Monte Carlo (MCMC) algorithms. Our framework relies on the introduction of an auxiliary distribution called a 'shepherding distribution' (SD) that is used to control several MCMC chains that run in parallel. The SD is an introduced prior on one or more key parameters (or hyperparameters) of the target distribution. The shepherded chains then collectively explore the space of samples, communicating via the shepherding distribution, to reach high-likelihood regions faster. The method of SDs is simple, and it is often easy to develop a shepherded sampler for a particular problem. Other advantages include wide applicability: the method can easily be used to draw samples from discrete distributions or from distributions on the simplex. Further, the method is asymptotically correct, since it trivially maintains detailed balance.
APA
Chowdhury, A. & Jermaine, C. (2018). Parallel and Distributed MCMC via Shepherding Distributions. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1819-1827. Available from https://proceedings.mlr.press/v84/chowdhury18a.html.
