Probabilistic Programs with Stochastic Conditioning

David Tolpin, Yuan Zhou, Tom Rainforth, Hongseok Yang
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:10312-10323, 2021.

Abstract

We tackle the problem of conditioning probabilistic programs on distributions of observable variables. Probabilistic programs are usually conditioned on samples from the joint data distribution, which we refer to as deterministic conditioning. However, in many real-life scenarios, the observations are given as marginal distributions, summary statistics, or samplers. Conventional probabilistic programming systems lack adequate means for modeling and inference in such scenarios. We propose a generalization of deterministic conditioning to stochastic conditioning, that is, conditioning on the marginal distribution of a variable taking a particular form. To this end, we first define the formal notion of stochastic conditioning and discuss its key properties. We then show how to perform inference in the presence of stochastic conditioning. We demonstrate potential usage of stochastic conditioning on several case studies which involve various kinds of stochastic conditioning and are difficult to solve otherwise. Although we present stochastic conditioning in the context of probabilistic programming, our formalization is general and applicable to other settings.
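To make the distinction concrete, below is a minimal, hypothetical sketch (plain Python with NumPy/SciPy, not the authors' implementation): a toy Gaussian model in which deterministic conditioning scores concrete observed values, while stochastic conditioning scores a latent x against a distribution D that is given only as a sampler. The sketch assumes the stochastically conditioned target weights x by exp(E_{y~D}[log p(y | x)]), approximated by a Monte Carlo average over draws from D; the function names, the toy model, and the random-walk Metropolis loop are illustrative choices, not taken from the paper.

```python
# Hypothetical sketch: deterministic vs. stochastic conditioning in a toy
# Gaussian model. Not the authors' code; the stochastic likelihood term is
# assumed to be the expected log-density E_{y~D}[log p(y | x)], estimated
# here by averaging over a batch of draws from D.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def log_target_deterministic(x, y_obs):
    """Deterministic conditioning: log p(x) + sum_i log p(y_i | x) for concrete y_obs."""
    return (stats.norm.logpdf(x, loc=0.0, scale=10.0)
            + stats.norm.logpdf(y_obs, loc=x, scale=1.0).sum())

def log_target_stochastic(x, y_draws):
    """Stochastic conditioning: log p(x) + E_{y~D}[log p(y | x)],
    with the expectation replaced by an average over draws from D."""
    return (stats.norm.logpdf(x, loc=0.0, scale=10.0)
            + stats.norm.logpdf(y_draws, loc=x, scale=1.0).mean())

# The "observation" is a sampler for D, not a data set; draw a batch up front.
sample_from_D = lambda n: rng.normal(loc=2.5, scale=1.0, size=n)
y_draws = sample_from_D(2000)

# For contrast, deterministic conditioning on three concrete observations:
print("log target (deterministic):",
      log_target_deterministic(2.5, np.array([2.3, 2.7, 2.4])))

# Random-walk Metropolis on the stochastically conditioned target.
x, samples = 0.0, []
log_p = log_target_stochastic(x, y_draws)
for _ in range(5000):
    x_prop = x + rng.normal(scale=0.5)
    log_p_prop = log_target_stochastic(x_prop, y_draws)
    if np.log(rng.uniform()) < log_p_prop - log_p:
        x, log_p = x_prop, log_p_prop
    samples.append(x)

print("posterior mean of x:", np.mean(samples[1000:]))  # near 2.5, the mean of D
```

In this sketch, conditioning on a fixed batch of draws amounts to conditioning on the empirical distribution of D; a probabilistic programming system with first-class stochastic conditioning would instead accept the sampler (or a density, or summary statistics) for D directly, as the paper's case studies illustrate.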

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-tolpin21a,
  title     = {Probabilistic Programs with Stochastic Conditioning},
  author    = {Tolpin, David and Zhou, Yuan and Rainforth, Tom and Yang, Hongseok},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {10312--10323},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/tolpin21a/tolpin21a.pdf},
  url       = {https://proceedings.mlr.press/v139/tolpin21a.html}
}
Endnote
%0 Conference Paper
%T Probabilistic Programs with Stochastic Conditioning
%A David Tolpin
%A Yuan Zhou
%A Tom Rainforth
%A Hongseok Yang
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-tolpin21a
%I PMLR
%P 10312--10323
%U https://proceedings.mlr.press/v139/tolpin21a.html
%V 139
APA
Tolpin, D., Zhou, Y., Rainforth, T. & Yang, H. (2021). Probabilistic Programs with Stochastic Conditioning. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:10312-10323. Available from https://proceedings.mlr.press/v139/tolpin21a.html.
