Improving Gibbs Sampler Scan Quality with DoGS

Ioannis Mitliagkas, Lester Mackey
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2469-2477, 2017.

Abstract

The pairwise influence matrix of Dobrushin has long been used as an analytical tool to bound the rate of convergence of Gibbs sampling. In this work, we use Dobrushin influence as the basis of a practical tool to certify and efficiently improve the quality of a Gibbs sampler. Our Dobrushin-optimized Gibbs samplers (DoGS) offer customized variable selection orders for a given sampling budget and variable subset of interest, explicit bounds on total variation distance to stationarity, and certifiable improvements over the standard systematic and uniform random scan Gibbs samplers. In our experiments with image segmentation, Markov chain Monte Carlo maximum likelihood estimation, and Ising model inference, DoGS consistently deliver higher-quality inferences with significantly smaller sampling budgets than standard Gibbs samplers.
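As a rough illustration of the quantity the abstract refers to (not the paper's DoGS algorithm itself): for an Ising model with coupling matrix theta, a standard bound on Dobrushin's pairwise influence of variable j on variable i is tanh(|theta_ij|), and when the largest total influence on any variable is below 1, total variation distance to stationarity decays geometrically with the number of sweeps. The sketch below uses a hypothetical 4-variable chain with made-up couplings; the decay-rate formula shown is the classical Dobrushin-condition bound, not the tighter scan-specific bounds developed in the paper.

```python
import numpy as np

# Hypothetical 4-variable Ising chain with symmetric couplings theta.
n = 4
theta = np.zeros((n, n))
for i in range(n - 1):
    theta[i, i + 1] = theta[i + 1, i] = 0.25

# For an Ising model, a standard bound on Dobrushin's pairwise
# influence of variable j on variable i is tanh(|theta_ij|).
C = np.tanh(np.abs(theta))

# Dobrushin's condition: the largest row sum of the influence matrix
# (the total influence on any single variable) must be below 1.
alpha = C.sum(axis=1).max()
assert alpha < 1, "Dobrushin's condition fails; this bound does not apply"

# Classical consequence: after t systematic-scan sweeps, total
# variation distance to stationarity is bounded (up to a constant
# depending on the start) by roughly alpha**t.
t = 20
tv_bound = alpha ** t
print(f"total influence alpha = {alpha:.3f}, "
      f"TV decay factor after {t} sweeps <= {tv_bound:.2e}")
```

This only certifies a uniform geometric rate; the paper's contribution is optimizing the variable selection order itself against such influence-based bounds, for a fixed sampling budget and target variable subset.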

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-mitliagkas17a,
  title     = {Improving {G}ibbs Sampler Scan Quality with {D}o{GS}},
  author    = {Ioannis Mitliagkas and Lester Mackey},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {2469--2477},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/mitliagkas17a/mitliagkas17a.pdf},
  url       = {https://proceedings.mlr.press/v70/mitliagkas17a.html},
  abstract  = {The pairwise influence matrix of Dobrushin has long been used as an analytical tool to bound the rate of convergence of Gibbs sampling. In this work, we use Dobrushin influence as the basis of a practical tool to certify and efficiently improve the quality of a Gibbs sampler. Our Dobrushin-optimized Gibbs samplers (DoGS) offer customized variable selection orders for a given sampling budget and variable subset of interest, explicit bounds on total variation distance to stationarity, and certifiable improvements over the standard systematic and uniform random scan Gibbs samplers. In our experiments with image segmentation, Markov chain Monte Carlo maximum likelihood estimation, and Ising model inference, DoGS consistently deliver higher-quality inferences with significantly smaller sampling budgets than standard Gibbs samplers.}
}
Endnote
%0 Conference Paper
%T Improving Gibbs Sampler Scan Quality with DoGS
%A Ioannis Mitliagkas
%A Lester Mackey
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-mitliagkas17a
%I PMLR
%P 2469--2477
%U https://proceedings.mlr.press/v70/mitliagkas17a.html
%V 70
%X The pairwise influence matrix of Dobrushin has long been used as an analytical tool to bound the rate of convergence of Gibbs sampling. In this work, we use Dobrushin influence as the basis of a practical tool to certify and efficiently improve the quality of a Gibbs sampler. Our Dobrushin-optimized Gibbs samplers (DoGS) offer customized variable selection orders for a given sampling budget and variable subset of interest, explicit bounds on total variation distance to stationarity, and certifiable improvements over the standard systematic and uniform random scan Gibbs samplers. In our experiments with image segmentation, Markov chain Monte Carlo maximum likelihood estimation, and Ising model inference, DoGS consistently deliver higher-quality inferences with significantly smaller sampling budgets than standard Gibbs samplers.
APA
Mitliagkas, I. & Mackey, L. (2017). Improving Gibbs Sampler Scan Quality with DoGS. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:2469-2477. Available from https://proceedings.mlr.press/v70/mitliagkas17a.html.