LF-PPL: A Low-Level First Order Probabilistic Programming Language for Non-Differentiable Models

Yuan Zhou, Bradley J. Gram-Hansen, Tobias Kohn, Tom Rainforth, Hongseok Yang, Frank Wood
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR 89:148-157, 2019.

Abstract

We develop a new Low-level, First-order Probabilistic Programming Language (LF-PPL) suited for models containing a mix of continuous, discrete, and/or piecewise-continuous variables. The key success of this language and its compilation scheme is in its ability to automatically distinguish parameters the density function is discontinuous with respect to, while further providing runtime checks for boundary crossings. This enables the introduction of new inference engines that are able to exploit gradient information, while remaining efficient for models which are not everywhere differentiable. We demonstrate this ability by incorporating a discontinuous Hamiltonian Monte Carlo (DHMC) inference engine that is able to deliver automated and efficient inference for non-differentiable models. Our system is backed up by a mathematical formalism that ensures that any model expressed in this language has a density with measure zero discontinuities to maintain the validity of the inference engine.
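As a toy illustration (not taken from the paper) of the setting described above, consider an unnormalized log-density that is smooth everywhere except at a single branch boundary. Conceptually, the compiler's role is to identify which variable the branch condition depends on, and the runtime's role is to detect when a sampler step crosses the boundary so a discontinuity-aware integrator such as DHMC can handle the jump. A minimal Python sketch, with all function names hypothetical:

```python
# Toy piecewise-continuous (unnormalized) log-density with a single
# discontinuity at x = 0. The paper's formalism requires such jumps to
# lie on a measure-zero set; here that set is the single point x = 0.
def log_density(x):
    base = -0.5 * x * x                   # smooth Gaussian part
    return base if x < 0 else base - 1.0  # jump of size 1 at the boundary


# Runtime boundary-crossing check of the kind the compilation scheme
# enables: between two sampler states, test whether the branch
# condition (x < 0) changed, i.e. whether a discontinuity boundary
# was crossed during the move.
def crossed_boundary(x_old, x_new):
    return (x_old < 0) != (x_new < 0)
```

A gradient-based sampler could use `crossed_boundary` to decide, per step, whether standard leapfrog updates are valid or whether the discontinuity must be handled explicitly.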

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-zhou19b,
  title     = {LF-PPL: A Low-Level First Order Probabilistic Programming Language for Non-Differentiable Models},
  author    = {Zhou, Yuan and Gram-Hansen, Bradley J. and Kohn, Tobias and Rainforth, Tom and Yang, Hongseok and Wood, Frank},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {148--157},
  year      = {2019},
  editor    = {Kamalika Chaudhuri and Masashi Sugiyama},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/zhou19b/zhou19b.pdf},
  url       = {http://proceedings.mlr.press/v89/zhou19b.html},
  abstract  = {We develop a new Low-level, First-order Probabilistic Programming Language (LF-PPL) suited for models containing a mix of continuous, discrete, and/or piecewise-continuous variables. The key success of this language and its compilation scheme is in its ability to automatically distinguish parameters the density function is discontinuous with respect to, while further providing runtime checks for boundary crossings. This enables the introduction of new inference engines that are able to exploit gradient information, while remaining efficient for models which are not everywhere differentiable. We demonstrate this ability by incorporating a discontinuous Hamiltonian Monte Carlo (DHMC) inference engine that is able to deliver automated and efficient inference for non-differentiable models. Our system is backed up by a mathematical formalism that ensures that any model expressed in this language has a density with measure zero discontinuities to maintain the validity of the inference engine.}
}
Endnote
%0 Conference Paper
%T LF-PPL: A Low-Level First Order Probabilistic Programming Language for Non-Differentiable Models
%A Yuan Zhou
%A Bradley J. Gram-Hansen
%A Tobias Kohn
%A Tom Rainforth
%A Hongseok Yang
%A Frank Wood
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-zhou19b
%I PMLR
%J Proceedings of Machine Learning Research
%P 148--157
%U http://proceedings.mlr.press
%V 89
%W PMLR
%X We develop a new Low-level, First-order Probabilistic Programming Language (LF-PPL) suited for models containing a mix of continuous, discrete, and/or piecewise-continuous variables. The key success of this language and its compilation scheme is in its ability to automatically distinguish parameters the density function is discontinuous with respect to, while further providing runtime checks for boundary crossings. This enables the introduction of new inference engines that are able to exploit gradient information, while remaining efficient for models which are not everywhere differentiable. We demonstrate this ability by incorporating a discontinuous Hamiltonian Monte Carlo (DHMC) inference engine that is able to deliver automated and efficient inference for non-differentiable models. Our system is backed up by a mathematical formalism that ensures that any model expressed in this language has a density with measure zero discontinuities to maintain the validity of the inference engine.
APA
Zhou, Y., Gram-Hansen, B.J., Kohn, T., Rainforth, T., Yang, H. & Wood, F. (2019). LF-PPL: A Low-Level First Order Probabilistic Programming Language for Non-Differentiable Models. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:148-157.