Bean Machine: A Declarative Probabilistic Programming Language For Efficient Programmable Inference

Nazanin Tehrani, Nimar S. Arora, Yucen Lily Li, Kinjal Divesh Shah, David Noursi, Michael Tingley, Narjes Torabi, Sepehr Masouleh, Eric Lippert, Erik Meijer
Proceedings of the 10th International Conference on Probabilistic Graphical Models, PMLR 138:485-496, 2020.

Abstract

A number of imperative Probabilistic Programming Languages (PPLs) have recently been proposed, but the imperative style makes it very hard to deduce the dependence structure between the latent variables, which can also change from iteration to iteration. We propose a new declarative-style PPL, Bean Machine, and demonstrate that in this new language the dynamic dependence structure is readily available. Although we are not the first to propose a declarative PPL or to observe the advantages of knowing the dependence structure, we take the idea further by showing other inference techniques that become feasible or easier in this style. We show that it is very easy for users to program inference by composition (combining different inference techniques for different parts of the model), customization (providing a custom hand-written inference method for specific variables), and blocking (specifying blocks of random variables that should be sampled together) in a declarative language. We provide a number of empirical results that back up these claims, modulo the runtime inefficiencies of unvectorized Python. As a fringe benefit, we note that it is very easy to translate statistical models written in mathematical notation into our language.
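To make the declarative style concrete, below is a minimal sketch of a Normal-Normal model with observed data. It follows the API of the open-source beanmachine.ppl package (the @bm.random_variable decorator, torch.distributions priors, and CompositionalInference); the specific model, data values, and inference configuration are illustrative assumptions, not reproduced from the paper.

import beanmachine.ppl as bm
import torch
import torch.distributions as dist

@bm.random_variable
def mu():
    # Latent mean with a standard-normal prior.
    return dist.Normal(0.0, 1.0)

@bm.random_variable
def x(i):
    # Each observation is declared as a function of mu(); the call graph of
    # these decorated functions exposes the dependence structure directly.
    return dist.Normal(mu(), 1.0)

# Bind observed data to the random variables x(0), x(1), x(2).
observations = {x(i): torch.tensor(v) for i, v in enumerate([0.9, 1.1, 1.3])}

# Compositional inference lets different single-site methods handle different
# variables; here the library's default per-variable choices are used.
samples = bm.CompositionalInference().infer(
    queries=[mu()],
    observations=observations,
    num_samples=1000,
    num_chains=1,
)
print(samples[mu()].mean())

In this style, composition, customization, and blocking amount to configuring which inference method handles which random-variable family (or block of families), rather than rewriting the model itself.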

Cite this Paper


BibTeX
@InProceedings{pmlr-v138-tehrani20a,
  title     = {Bean Machine: A Declarative Probabilistic Programming Language For Efficient Programmable Inference},
  author    = {Tehrani, Nazanin and Arora, Nimar S. and Li, Yucen Lily and Shah, Kinjal Divesh and Noursi, David and Tingley, Michael and Torabi, Narjes and Masouleh, Sepehr and Lippert, Eric and Meijer, Erik},
  booktitle = {Proceedings of the 10th International Conference on Probabilistic Graphical Models},
  pages     = {485--496},
  year      = {2020},
  editor    = {Jaeger, Manfred and Nielsen, Thomas Dyhre},
  volume    = {138},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--25 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v138/tehrani20a/tehrani20a.pdf},
  url       = {https://proceedings.mlr.press/v138/tehrani20a.html},
  abstract  = {A number of imperative Probabilistic Programming Languages (PPLs) have recently been proposed, but the imperative style makes it very hard to deduce the dependence structure between the latent variables, which can also change from iteration to iteration. We propose a new declarative-style PPL, Bean Machine, and demonstrate that in this new language the dynamic dependence structure is readily available. Although we are not the first to propose a declarative PPL or to observe the advantages of knowing the dependence structure, we take the idea further by showing other inference techniques that become feasible or easier in this style. We show that it is very easy for users to program inference by composition (combining different inference techniques for different parts of the model), customization (providing a custom hand-written inference method for specific variables), and blocking (specifying blocks of random variables that should be sampled together) in a declarative language. We provide a number of empirical results that back up these claims, modulo the runtime inefficiencies of unvectorized Python. As a fringe benefit, we note that it is very easy to translate statistical models written in mathematical notation into our language.}
}
Endnote
%0 Conference Paper
%T Bean Machine: A Declarative Probabilistic Programming Language For Efficient Programmable Inference
%A Nazanin Tehrani
%A Nimar S. Arora
%A Yucen Lily Li
%A Kinjal Divesh Shah
%A David Noursi
%A Michael Tingley
%A Narjes Torabi
%A Sepehr Masouleh
%A Eric Lippert
%A Erik Meijer
%B Proceedings of the 10th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2020
%E Manfred Jaeger
%E Thomas Dyhre Nielsen
%F pmlr-v138-tehrani20a
%I PMLR
%P 485--496
%U https://proceedings.mlr.press/v138/tehrani20a.html
%V 138
%X A number of imperative Probabilistic Programming Languages (PPLs) have recently been proposed, but the imperative style makes it very hard to deduce the dependence structure between the latent variables, which can also change from iteration to iteration. We propose a new declarative-style PPL, Bean Machine, and demonstrate that in this new language the dynamic dependence structure is readily available. Although we are not the first to propose a declarative PPL or to observe the advantages of knowing the dependence structure, we take the idea further by showing other inference techniques that become feasible or easier in this style. We show that it is very easy for users to program inference by composition (combining different inference techniques for different parts of the model), customization (providing a custom hand-written inference method for specific variables), and blocking (specifying blocks of random variables that should be sampled together) in a declarative language. We provide a number of empirical results that back up these claims, modulo the runtime inefficiencies of unvectorized Python. As a fringe benefit, we note that it is very easy to translate statistical models written in mathematical notation into our language.
APA
Tehrani, N., Arora, N.S., Li, Y.L., Shah, K.D., Noursi, D., Tingley, M., Torabi, N., Masouleh, S., Lippert, E. & Meijer, E. (2020). Bean Machine: A Declarative Probabilistic Programming Language For Efficient Programmable Inference. Proceedings of the 10th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 138:485-496. Available from https://proceedings.mlr.press/v138/tehrani20a.html.