Parallel Probabilistic Inference by Weighted Model Counting

Giso H. Dal, Alfons W. Laarman, Peter J.F. Lucas
Proceedings of the Ninth International Conference on Probabilistic Graphical Models, PMLR 72:97-108, 2018.

Abstract

Knowledge compilation as part of the Weighted Model Counting approach has proven to be an efficient tool for exact inference in probabilistic graphical models, by exploiting structures that more traditional methods cannot. The availability of affordable high-performance commodity hardware has inspired other inference approaches to exploit parallelism, to great success. In this paper, we explore the possibilities for Weighted Model Counting. Using a set of real-world Bayesian networks, we empirically confirm that exploiting parallelism yields substantial speedups.
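For readers unfamiliar with the core idea, weighted model counting sums, over all satisfying assignments of a propositional formula, the product of the weights of the literals in each assignment; probabilistic queries on a Bayesian network reduce to such counts. The following is a minimal brute-force sketch of the definition only (the paper's approach instead compiles the formula and parallelizes the count); the clause encoding and weight values here are illustrative, not taken from the paper.

```python
from itertools import product

def wmc(clauses, weights, n_vars):
    """Brute-force weighted model count over a CNF.

    clauses: list of clauses; each clause is a list of nonzero ints,
             +i meaning variable i is true, -i meaning it is false.
    weights: dict mapping each literal (+i / -i) to its weight.
    Returns the sum over satisfying assignments of the product of
    the weights of the literals made true by that assignment.
    """
    total = 0.0
    for assignment in product([False, True], repeat=n_vars):
        # A clause is satisfied if any of its literals holds.
        if all(any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)
               for clause in clauses):
            w = 1.0
            for v, val in enumerate(assignment, start=1):
                w *= weights[v] if val else weights[-v]
            total += w
    return total

# With literal weights chosen as probabilities of two independent
# binary variables A (var 1) and B (var 2), WMC of the single clause
# (A or B) recovers P(A or B) = 1 - 0.7 * 0.4 = 0.72.
w = {1: 0.3, -1: 0.7, 2: 0.6, -2: 0.4}
print(wmc([[1, 2]], w, 2))  # → 0.72 (up to floating-point rounding)
```

Enumerating all 2^n assignments is exponential, which is exactly why knowledge compilation and, as this paper explores, parallelism matter in practice.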

Cite this Paper

BibTeX
@InProceedings{pmlr-v72-dal18a,
  title     = {Parallel Probabilistic Inference by Weighted Model Counting},
  author    = {Dal, Giso H. and Laarman, Alfons W. and Lucas, Peter J.F.},
  booktitle = {Proceedings of the Ninth International Conference on Probabilistic Graphical Models},
  pages     = {97--108},
  year      = {2018},
  editor    = {Kratochvíl, Václav and Studený, Milan},
  volume    = {72},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v72/dal18a/dal18a.pdf},
  url       = {https://proceedings.mlr.press/v72/dal18a.html},
  abstract  = {Knowledge compilation as part of the Weighted Model Counting approach has proven to be an efficient tool for exact inference in probabilistic graphical models, by exploiting structures that more traditional methods cannot. The availability of affordable high-performance commodity hardware has inspired other inference approaches to exploit parallelism, to great success. In this paper, we explore the possibilities for Weighted Model Counting. Using a set of real-world Bayesian networks, we empirically confirm that exploiting parallelism yields substantial speedups.}
}
Endnote
%0 Conference Paper
%T Parallel Probabilistic Inference by Weighted Model Counting
%A Giso H. Dal
%A Alfons W. Laarman
%A Peter J.F. Lucas
%B Proceedings of the Ninth International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2018
%E Václav Kratochvíl
%E Milan Studený
%F pmlr-v72-dal18a
%I PMLR
%P 97--108
%U https://proceedings.mlr.press/v72/dal18a.html
%V 72
%X Knowledge compilation as part of the Weighted Model Counting approach has proven to be an efficient tool for exact inference in probabilistic graphical models, by exploiting structures that more traditional methods cannot. The availability of affordable high-performance commodity hardware has inspired other inference approaches to exploit parallelism, to great success. In this paper, we explore the possibilities for Weighted Model Counting. Using a set of real-world Bayesian networks, we empirically confirm that exploiting parallelism yields substantial speedups.
APA
Dal, G.H., Laarman, A.W. & Lucas, P.J.F. (2018). Parallel Probabilistic Inference by Weighted Model Counting. Proceedings of the Ninth International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 72:97-108. Available from https://proceedings.mlr.press/v72/dal18a.html.