Discrete-Continuous Mixtures in Probabilistic Programming: Generalized Semantics and Inference Algorithms

Yi Wu, Siddharth Srivastava, Nicholas Hay, Simon Du, Stuart Russell
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:5343-5352, 2018.

Abstract

Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions combine discrete and continuous elements. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We develop two new general sampling algorithms that are provably correct under the MTBN framework: the lexicographic likelihood weighting (LLW) for general MTBNs and the lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms through representative examples.
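For readers unfamiliar with the baseline the paper builds on: classical likelihood weighting samples every non-evidence variable from its prior and weights each sample by the likelihood of the observed evidence. The sketch below shows that standard algorithm on a toy two-node discrete network (it is *not* the paper's lexicographic LLW, and the network and probabilities are illustrative, not from the paper).

```python
import random

# Classical likelihood weighting on a toy Bayesian network Rain -> WetGrass.
# This is the standard algorithm that the paper's LLW generalizes; all
# numbers here are made up for illustration.

P_RAIN = 0.2                                   # prior P(Rain = True)
P_WET_GIVEN_RAIN = {True: 0.9, False: 0.1}     # P(WetGrass = True | Rain)

def likelihood_weighting(n_samples, evidence_wet=True, seed=0):
    """Estimate P(Rain = True | WetGrass = evidence_wet)."""
    rng = random.Random(seed)
    num = 0.0  # weighted count of samples where Rain is True
    den = 0.0  # total weight across all samples
    for _ in range(n_samples):
        rain = rng.random() < P_RAIN           # sample Rain from its prior
        p_wet = P_WET_GIVEN_RAIN[rain]
        weight = p_wet if evidence_wet else 1.0 - p_wet  # evidence likelihood
        den += weight
        if rain:
            num += weight
    return num / den

# Exact posterior: 0.2*0.9 / (0.2*0.9 + 0.8*0.1) = 0.18/0.26 ~ 0.692
print(likelihood_weighting(100_000))
```

When evidence mixes point masses with densities, these weights compare a probability against a density and the ratio is no longer meaningful; tracking them lexicographically, as the paper's LLW does, is what restores correctness.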

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-wu18f,
  title     = {Discrete-Continuous Mixtures in Probabilistic Programming: Generalized Semantics and Inference Algorithms},
  author    = {Wu, Yi and Srivastava, Siddharth and Hay, Nicholas and Du, Simon and Russell, Stuart},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {5343--5352},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/wu18f/wu18f.pdf},
  url       = {http://proceedings.mlr.press/v80/wu18f.html},
  abstract  = {Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions combine discrete and continuous elements. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We develop two new general sampling algorithms that are provably correct under the MTBN framework: the lexicographic likelihood weighting (LLW) for general MTBNs and the lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms through representative examples.}
}
Endnote
%0 Conference Paper
%T Discrete-Continuous Mixtures in Probabilistic Programming: Generalized Semantics and Inference Algorithms
%A Yi Wu
%A Siddharth Srivastava
%A Nicholas Hay
%A Simon Du
%A Stuart Russell
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-wu18f
%I PMLR
%P 5343--5352
%U http://proceedings.mlr.press/v80/wu18f.html
%V 80
%X Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions combine discrete and continuous elements. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We develop two new general sampling algorithms that are provably correct under the MTBN framework: the lexicographic likelihood weighting (LLW) for general MTBNs and the lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms through representative examples.
APA
Wu, Y., Srivastava, S., Hay, N., Du, S. & Russell, S. (2018). Discrete-Continuous Mixtures in Probabilistic Programming: Generalized Semantics and Inference Algorithms. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:5343-5352. Available from http://proceedings.mlr.press/v80/wu18f.html.