Parallel and Distributed MCMC via Shepherding Distributions

Arkabandhu Chowdhury, Christopher Jermaine;
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1819-1827, 2018.

Abstract

In this paper, we present a general algorithmic framework for developing easily parallelizable/distributable Markov Chain Monte Carlo (MCMC) algorithms. Our framework relies on the introduction of an auxiliary distribution, called a "shepherding distribution" (SD), that is used to control several MCMC chains running in parallel. The SD is an introduced prior on one or more key parameters (or hyperparameters) of the target distribution. The shepherded chains then collectively explore the sample space, communicating via the shepherding distribution, to reach high-likelihood regions faster. The method of SDs is simple, and it is often easy to develop a shepherded sampler for a particular problem. Other advantages include wide applicability: the method can easily be used to draw samples from discrete distributions or distributions on the simplex. Further, the method is asymptotically correct, since the method of SDs trivially maintains detailed balance.
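To make the abstract's description concrete, the following is a minimal sketch of the general idea: parallel chains that share an auxiliary "shepherding" parameter and communicate only through its update. The toy target, the alternating Metropolis scheme, and all names and step sizes here are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def shepherded_mcmc(log_target, log_sd, n_chains=4, n_iters=2000,
                    x_step=0.5, theta_step=0.5, seed=0):
    """Illustrative sketch (assumed structure, not the paper's algorithm).

    Samples from the augmented density
        p(x_1, ..., x_K, theta) ∝ SD(theta) * prod_k pi(x_k | theta)
    by alternating Metropolis updates: each chain updates its own state
    x_k given theta (embarrassingly parallel), then theta is updated
    given all chain states -- the step through which chains communicate.
    Both updates satisfy detailed balance on the joint distribution."""
    rng = np.random.default_rng(seed)
    xs = rng.normal(size=n_chains)   # one state per parallel chain
    theta = 0.0                      # shared shepherding parameter
    samples = np.empty((n_iters, n_chains))
    for t in range(n_iters):
        # Per-chain Metropolis step, conditioned on the shared theta.
        props = xs + x_step * rng.normal(size=n_chains)
        log_acc = log_target(props, theta) - log_target(xs, theta)
        accept = np.log(rng.uniform(size=n_chains)) < log_acc
        xs = np.where(accept, props, xs)
        # Shepherding update: theta's acceptance ratio involves every
        # chain, so this is the only inter-chain communication needed.
        theta_prop = theta + theta_step * rng.normal()
        log_acc_t = (log_sd(theta_prop) + log_target(xs, theta_prop).sum()
                     - log_sd(theta) - log_target(xs, theta).sum())
        if np.log(rng.uniform()) < log_acc_t:
            theta = theta_prop
        samples[t] = xs
    return samples

# Toy instance (assumed for illustration):
#   pi(x | theta) = N(x; theta, 1),  SD(theta) = N(0, 10^2).
log_target = lambda x, theta: -0.5 * (x - theta) ** 2
log_sd = lambda theta: -0.5 * (theta / 10.0) ** 2
samples = shepherded_mcmc(log_target, log_sd)
```

In this sketch the per-chain updates can run on separate workers, and only the scalar theta update requires gathering the chain states, which is what makes the scheme easy to parallelize or distribute.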
