Spherical Hamiltonian Monte Carlo for Constrained Target Distributions

Shiwei Lan, Bo Zhou, Babak Shahbaba
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):629-637, 2014.

Abstract

Statistical models with constrained probability distributions are abundant in machine learning. Examples include regression models with norm constraints (e.g., Lasso), probit models, many copula models, and Latent Dirichlet Allocation (LDA). Bayesian inference involving probability distributions confined to constrained domains can be quite challenging for commonly used sampling algorithms. For such problems, we propose a novel Markov chain Monte Carlo (MCMC) method that provides a general and computationally efficient framework for handling boundary conditions. Our method first maps the D-dimensional constrained domain of parameters to the unit ball B_0^D(1), then augments it to the D-dimensional sphere S^D such that the original boundary corresponds to the equator of S^D. This way, our method handles the constraints implicitly by moving freely on the sphere, generating proposals that remain within the boundary when mapped back to the original space. To improve the computational efficiency of our algorithm, we split the dynamics into several parts such that the resulting split dynamics has a partial analytical solution as a geodesic flow on the sphere. We apply our method to several examples, including truncated Gaussian distributions, Bayesian Lasso, Bayesian bridge regression, and a copula model for identifying synchrony among multiple neurons. Our results show that the proposed method provides a natural and efficient framework for handling several types of constraints on target distributions.
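The two ingredients named in the abstract, the ball-to-sphere augmentation and the analytically solvable geodesic flow on the sphere, can be sketched in a few lines. The Python snippet below is a minimal illustration written from the description above, not the authors' implementation: the augmentation adds the coordinate sqrt(1 - |theta|^2) so the ball's boundary lands on the equator, and the geodesic step is the standard closed-form great-circle motion. All function names are illustrative.

import numpy as np

def ball_to_sphere(theta):
    # Augment a point theta in the unit ball B_0^D(1) to a point on S^D
    # (upper hemisphere shown for illustration); |theta| = 1 maps to the equator.
    return np.append(theta, np.sqrt(max(1.0 - theta @ theta, 0.0)))

def sphere_to_ball(q):
    # Map back to the unit ball by dropping the augmented last coordinate.
    return q[:-1]

def tangent_project(q, g):
    # Project an ambient vector g onto the tangent space of the sphere at q.
    return g - (q @ g) * q

def geodesic_flow(q, v, t):
    # Exact geodesic (great-circle) flow on the unit sphere for time t,
    # with q a unit vector and v a tangent vector (q @ v = 0).
    # This is the analytically solvable part of the split dynamics.
    speed = np.linalg.norm(v)
    if speed == 0.0:
        return q, v
    q_new = q * np.cos(speed * t) + (v / speed) * np.sin(speed * t)
    v_new = -q * speed * np.sin(speed * t) + v * np.cos(speed * t)
    return q_new, v_new

# Example: one geodesic step starting from the centre of the ball.
theta = np.zeros(3)                          # point in B_0^3(1)
q = ball_to_sphere(theta)                    # north pole of S^3
v = tangent_project(q, np.random.randn(4))   # random tangent velocity
q, v = geodesic_flow(q, v, t=0.1)            # stays exactly on the sphere
theta = sphere_to_ball(q)                    # proposal back inside the unit ball

In the full algorithm this geodesic step would alternate with momentum updates driven by the gradient of the target density (projected to the tangent space); the sketch only shows the sphere part that keeps proposals within the constraint automatically.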

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-lan14,
  title     = {Spherical Hamiltonian Monte Carlo for Constrained Target Distributions},
  author    = {Lan, Shiwei and Zhou, Bo and Shahbaba, Babak},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {629--637},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/lan14.pdf},
  url       = {https://proceedings.mlr.press/v32/lan14.html}
}
APA
Lan, S., Zhou, B. & Shahbaba, B. (2014). Spherical Hamiltonian Monte Carlo for Constrained Target Distributions. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):629-637. Available from https://proceedings.mlr.press/v32/lan14.html.
