Parallel Markov Chain Monte Carlo via Spectral Clustering

Guillaume Basse, Aaron Smith, Natesh Pillai
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1318-1327, 2016.

Abstract

As it has become common to use many computer cores in routine applications, finding good ways to parallelize popular algorithms has become increasingly important. In this paper, we present a parallelization scheme for Markov chain Monte Carlo (MCMC) methods based on spectral clustering of the underlying state space, generalizing earlier work on parallelization of MCMC methods by state space partitioning. We show empirically that this approach speeds up MCMC sampling for multimodal distributions and that it can be usefully applied in greater generality than several related algorithms. Our algorithm converges under reasonable conditions to an ‘optimal’ MCMC algorithm. We also show that our approach can be asymptotically far more efficient than naive parallelization, even in situations such as completely flat target distributions where no unique optimal algorithm exists. Finally, we combine theoretical and empirical bounds to provide practical guidance on the choice of tuning parameters.

Cite this Paper

BibTeX
@InProceedings{pmlr-v51-basse16a,
  title     = {Parallel Markov Chain Monte Carlo via Spectral Clustering},
  author    = {Basse, Guillaume and Smith, Aaron and Pillai, Natesh},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {1318--1327},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/basse16a.pdf},
  url       = {https://proceedings.mlr.press/v51/basse16a.html},
  abstract  = {As it has become common to use many computer cores in routine applications, finding good ways to parallelize popular algorithms has become increasingly important. In this paper, we present a parallelization scheme for Markov chain Monte Carlo (MCMC) methods based on spectral clustering of the underlying state space, generalizing earlier work on parallelization of MCMC methods by state space partitioning. We show empirically that this approach speeds up MCMC sampling for multimodal distributions and that it can be usefully applied in greater generality than several related algorithms. Our algorithm converges under reasonable conditions to an ‘optimal’ MCMC algorithm. We also show that our approach can be asymptotically far more efficient than naive parallelization, even in situations such as completely flat target distributions where no unique optimal algorithm exists. Finally, we combine theoretical and empirical bounds to provide practical guidance on the choice of tuning parameters.}
}
Endnote
%0 Conference Paper
%T Parallel Markov Chain Monte Carlo via Spectral Clustering
%A Guillaume Basse
%A Aaron Smith
%A Natesh Pillai
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-basse16a
%I PMLR
%P 1318--1327
%U https://proceedings.mlr.press/v51/basse16a.html
%V 51
%X As it has become common to use many computer cores in routine applications, finding good ways to parallelize popular algorithms has become increasingly important. In this paper, we present a parallelization scheme for Markov chain Monte Carlo (MCMC) methods based on spectral clustering of the underlying state space, generalizing earlier work on parallelization of MCMC methods by state space partitioning. We show empirically that this approach speeds up MCMC sampling for multimodal distributions and that it can be usefully applied in greater generality than several related algorithms. Our algorithm converges under reasonable conditions to an ‘optimal’ MCMC algorithm. We also show that our approach can be asymptotically far more efficient than naive parallelization, even in situations such as completely flat target distributions where no unique optimal algorithm exists. Finally, we combine theoretical and empirical bounds to provide practical guidance on the choice of tuning parameters.
RIS
TY - CPAPER
TI - Parallel Markov Chain Monte Carlo via Spectral Clustering
AU - Guillaume Basse
AU - Aaron Smith
AU - Natesh Pillai
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-basse16a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 1318
EP - 1327
L1 - http://proceedings.mlr.press/v51/basse16a.pdf
UR - https://proceedings.mlr.press/v51/basse16a.html
AB - As it has become common to use many computer cores in routine applications, finding good ways to parallelize popular algorithms has become increasingly important. In this paper, we present a parallelization scheme for Markov chain Monte Carlo (MCMC) methods based on spectral clustering of the underlying state space, generalizing earlier work on parallelization of MCMC methods by state space partitioning. We show empirically that this approach speeds up MCMC sampling for multimodal distributions and that it can be usefully applied in greater generality than several related algorithms. Our algorithm converges under reasonable conditions to an ‘optimal’ MCMC algorithm. We also show that our approach can be asymptotically far more efficient than naive parallelization, even in situations such as completely flat target distributions where no unique optimal algorithm exists. Finally, we combine theoretical and empirical bounds to provide practical guidance on the choice of tuning parameters.
ER -
APA
Basse, G., Smith, A. & Pillai, N. (2016). Parallel Markov Chain Monte Carlo via Spectral Clustering. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:1318-1327. Available from https://proceedings.mlr.press/v51/basse16a.html.
