Automatically marginalized MCMC in probabilistic programming

Jinlin Lai, Javier Burroni, Hui Guan, Daniel Sheldon
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:18301-18318, 2023.

Abstract

Hamiltonian Monte Carlo (HMC) is a powerful algorithm to sample latent variables from Bayesian models. The advent of probabilistic programming languages (PPLs) frees users from writing inference algorithms and lets users focus on modeling. However, many models are difficult for HMC to solve directly, and often require tricks like model reparameterization. We are motivated by the fact that many of those models could be simplified by marginalization. We propose to use automatic marginalization as part of the sampling process using HMC in a graphical model extracted from a PPL, which substantially improves sampling from real-world hierarchical models.
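The abstract's central idea, that conjugate structure lets some latent variables be integrated out analytically before running HMC, can be illustrated with a toy normal-normal model. This sketch is not the paper's implementation; the model and variable names are illustrative:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Toy hierarchical model: x ~ N(mu, s^2); y | x ~ N(x, t^2).
# Marginalizing x analytically gives y ~ N(mu, s^2 + t^2),
# removing the x-y coupling that can slow HMC on the joint model.
mu, s, t = 0.0, 1.0, 0.5
y_obs = 1.3

# Marginal density of y from the closed form.
analytic = stats.norm(mu, np.sqrt(s**2 + t**2)).pdf(y_obs)

# The same quantity obtained by numerically integrating x out
# of the joint density p(x) p(y | x).
numeric, _ = quad(
    lambda x: stats.norm(mu, s).pdf(x) * stats.norm(x, t).pdf(y_obs),
    -10.0, 10.0,
)

assert abs(analytic - numeric) < 1e-6
```

The paper automates this kind of transformation on graphical models extracted from a PPL, so the user writes the hierarchical model directly and the sampler operates on the marginalized version.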

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-lai23a,
  title     = {Automatically marginalized {MCMC} in probabilistic programming},
  author    = {Lai, Jinlin and Burroni, Javier and Guan, Hui and Sheldon, Daniel},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {18301--18318},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/lai23a/lai23a.pdf},
  url       = {https://proceedings.mlr.press/v202/lai23a.html},
  abstract  = {Hamiltonian Monte Carlo (HMC) is a powerful algorithm to sample latent variables from Bayesian models. The advent of probabilistic programming languages (PPLs) frees users from writing inference algorithms and lets users focus on modeling. However, many models are difficult for HMC to solve directly, and often require tricks like model reparameterization. We are motivated by the fact that many of those models could be simplified by marginalization. We propose to use automatic marginalization as part of the sampling process using HMC in a graphical model extracted from a PPL, which substantially improves sampling from real-world hierarchical models.}
}
Endnote
%0 Conference Paper
%T Automatically marginalized MCMC in probabilistic programming
%A Jinlin Lai
%A Javier Burroni
%A Hui Guan
%A Daniel Sheldon
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-lai23a
%I PMLR
%P 18301--18318
%U https://proceedings.mlr.press/v202/lai23a.html
%V 202
%X Hamiltonian Monte Carlo (HMC) is a powerful algorithm to sample latent variables from Bayesian models. The advent of probabilistic programming languages (PPLs) frees users from writing inference algorithms and lets users focus on modeling. However, many models are difficult for HMC to solve directly, and often require tricks like model reparameterization. We are motivated by the fact that many of those models could be simplified by marginalization. We propose to use automatic marginalization as part of the sampling process using HMC in a graphical model extracted from a PPL, which substantially improves sampling from real-world hierarchical models.
APA
Lai, J., Burroni, J., Guan, H. &amp; Sheldon, D. (2023). Automatically marginalized MCMC in probabilistic programming. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:18301-18318. Available from https://proceedings.mlr.press/v202/lai23a.html.