Stochastic Gradient Monomial Gamma Sampler

Yizhe Zhang, Changyou Chen, Zhe Gan, Ricardo Henao, Lawrence Carin
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:3996-4005, 2017.

Abstract

Advances in stochastic gradient techniques have made it possible to scale Markov Chain Monte Carlo (MCMC) to the estimation of posterior distributions from large datasets. Despite this success, existing methods can mix poorly when sampling from multimodal distributions with a limited budget of Monte Carlo samples, evidenced by slow convergence and insufficient exploration of the posterior. We propose a generalized framework that improves the sampling efficiency of stochastic gradient MCMC by leveraging generalized kinetics with superior stationary mixing, especially on multimodal distributions, together with several techniques that address the resulting practical issues. We show that the proposed approach explores complicated multimodal posterior distributions more effectively, and demonstrate improvements over other stochastic gradient MCMC methods on a range of applications.
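
To give a feel for what "generalized kinetics" means here, the sketch below runs naive SGHMC-style dynamics on a toy bimodal target, but with a monomial Gamma kinetic energy in place of the usual quadratic one. This is a minimal illustration, not the paper's algorithm: the kinetics form K(p) = |p|^(1/a) (with a = 1/2 recovering Gaussian kinetics up to constants) is assumed from the earlier monomial Gamma HMC literature, and the step size eps, friction C, monomial parameter a, noisy-gradient stand-in, and all function names are illustrative assumptions.

```python
# Hypothetical sketch: stochastic-gradient dynamics with monomial Gamma
# kinetics (SGHMC-style updates; NOT the paper's exact algorithm or tuning).
import numpy as np

rng = np.random.default_rng(0)

# Toy multimodal target: equal mixture of N(-3, 0.8^2) and N(+3, 0.8^2).
MU1, MU2, S2 = -3.0, 3.0, 0.8**2

def grad_U(theta):
    """Gradient of U(theta) = -log p(theta) for the Gaussian mixture."""
    d1, d2 = theta - MU1, theta - MU2
    w1 = np.exp(-0.5 * d1**2 / S2)
    w2 = np.exp(-0.5 * d2**2 / S2)
    return (w1 * d1 + w2 * d2) / ((w1 + w2) * S2)

def noisy_grad_U(theta, noise=1.0):
    """Stochastic gradient: full gradient plus Gaussian noise, standing in
    for minibatch noise (an assumption of this sketch)."""
    return grad_U(theta) + noise * rng.standard_normal()

def grad_K(p, a=1.0, tiny=1e-8):
    """Gradient of the assumed monomial Gamma kinetics K(p) = |p|^(1/a).
    a = 1/2 gives quadratic (Gaussian) kinetics up to constants; a = 1
    gives a bounded, sign-like velocity that helps hop between modes."""
    return np.sign(p) * (np.abs(p) + tiny) ** (1.0 / a - 1.0) / a

def sgmg_sample(n_steps=200_000, eps=2e-3, C=3.0, a=1.0):
    """Naive SGHMC-type updates with generalized kinetics:
         theta <- theta + eps * grad_K(p)
         p     <- p - eps * noisy_grad_U(theta) - eps * C * grad_K(p)
                    + N(0, 2 * C * eps)
    Friction C and injected noise balance the stochastic-gradient noise."""
    theta, p = 0.0, 0.0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        v = grad_K(p, a)
        theta += eps * v
        p += (-eps * noisy_grad_U(theta) - eps * C * v
              + np.sqrt(2.0 * C * eps) * rng.standard_normal())
        samples[t] = theta
    return samples

if __name__ == "__main__":
    s = sgmg_sample()
    # A well-mixing chain should spend roughly half its time in each mode.
    print(f"fraction in left mode : {np.mean(s < 0):.3f}")
    print(f"fraction in right mode: {np.mean(s >= 0):.3f}")
```

Setting a = 1/2 in the same loop reduces grad_K to a rescaled identity, i.e. ordinary SGHMC; heavier-tailed kinetics (larger a) keep the velocity from collapsing near p = 0, which is the intuition the abstract points to for improved multimodal mixing.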

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-zhang17a,
  title     = {Stochastic Gradient Monomial Gamma Sampler},
  author    = {Yizhe Zhang and Changyou Chen and Zhe Gan and Ricardo Henao and Lawrence Carin},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {3996--4005},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/zhang17a/zhang17a.pdf},
  url       = {https://proceedings.mlr.press/v70/zhang17a.html},
  abstract  = {Scaling Markov Chain Monte Carlo (MCMC) to estimate posterior distributions from large datasets has been made possible as a result of advances in stochastic gradient techniques. Despite their success, mixing performance of existing methods when sampling from multimodal distributions can be less efficient with insufficient Monte Carlo samples; this is evidenced by slow convergence and insufficient exploration of posterior distributions. We propose a generalized framework to improve the sampling efficiency of stochastic gradient MCMC, by leveraging a generalized kinetics that delivers superior stationary mixing, especially in multimodal distributions, and propose several techniques to overcome the practical issues. We show that the proposed approach is better at exploring a complicated multimodal posterior distribution, and demonstrate improvements over other stochastic gradient MCMC methods on various applications.}
}
Endnote
%0 Conference Paper
%T Stochastic Gradient Monomial Gamma Sampler
%A Yizhe Zhang
%A Changyou Chen
%A Zhe Gan
%A Ricardo Henao
%A Lawrence Carin
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-zhang17a
%I PMLR
%P 3996--4005
%U https://proceedings.mlr.press/v70/zhang17a.html
%V 70
%X Scaling Markov Chain Monte Carlo (MCMC) to estimate posterior distributions from large datasets has been made possible as a result of advances in stochastic gradient techniques. Despite their success, mixing performance of existing methods when sampling from multimodal distributions can be less efficient with insufficient Monte Carlo samples; this is evidenced by slow convergence and insufficient exploration of posterior distributions. We propose a generalized framework to improve the sampling efficiency of stochastic gradient MCMC, by leveraging a generalized kinetics that delivers superior stationary mixing, especially in multimodal distributions, and propose several techniques to overcome the practical issues. We show that the proposed approach is better at exploring a complicated multimodal posterior distribution, and demonstrate improvements over other stochastic gradient MCMC methods on various applications.
APA
Zhang, Y., Chen, C., Gan, Z., Henao, R. & Carin, L. (2017). Stochastic Gradient Monomial Gamma Sampler. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:3996-4005. Available from https://proceedings.mlr.press/v70/zhang17a.html.