The Parameterized Complexity of Approximate Inference in Bayesian Networks

Johan Kwisthout
Proceedings of the Eighth International Conference on Probabilistic Graphical Models, PMLR 52:264-274, 2016.

Abstract

Computing posterior and marginal probabilities constitutes the backbone of almost all inferences in Bayesian networks. These computations are known to be intractable in general, both to compute exactly and to approximate by sampling algorithms. While it is well known under what constraints exact computation can be rendered tractable (viz., bounding tree-width of the moralized network and bounding the cardinality of the variables), it is less well known under what constraints approximate Bayesian inference can be tractable. Here, we use the formal framework of fixed-error randomized tractability (a randomized analogue of fixed-parameter tractability) to address this problem, both by re-interpreting known results from the literature and providing some additional new results, including results on fixed-parameter tractable de-randomization of approximate inference.
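To make the notion of sampling-based approximation concrete, here is a minimal illustrative sketch (not taken from the paper): it forward-samples a hypothetical two-node network A -> B and estimates the marginal P(B=1); the probabilities and network are made up for illustration only.

# Illustrative sketch (not from the paper): forward sampling to approximate a
# marginal probability in a hypothetical two-node Bayesian network A -> B.
import random

# Hypothetical parameters: P(A=1) = 0.3, P(B=1 | A=a) for each value of A.
P_A = 0.3
P_B_GIVEN_A = {0: 0.2, 1: 0.9}

def forward_sample():
    """Draw one joint sample (a, b) by sampling parents before children."""
    a = 1 if random.random() < P_A else 0
    b = 1 if random.random() < P_B_GIVEN_A[a] else 0
    return a, b

def estimate_marginal_b(num_samples=100_000):
    """Estimate P(B=1) as the fraction of samples in which b == 1."""
    hits = sum(forward_sample()[1] for _ in range(num_samples))
    return hits / num_samples

if __name__ == "__main__":
    # Exact value for comparison: P(B=1) = 0.7*0.2 + 0.3*0.9 = 0.41.
    print("estimate:", estimate_marginal_b(), "exact:", 0.41)

By Hoeffding's inequality, roughly ln(2/delta)/(2*epsilon^2) samples suffice to get additive error epsilon with probability at least 1 - delta; guarantees of this additive, probabilistic kind are what the fixed-error randomized tractability framework parameterizes.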

Cite this Paper


BibTeX
@InProceedings{pmlr-v52-kwisthout16,
  title = {The Parameterized Complexity of Approximate Inference in {B}ayesian Networks},
  author = {Kwisthout, Johan},
  booktitle = {Proceedings of the Eighth International Conference on Probabilistic Graphical Models},
  pages = {264--274},
  year = {2016},
  editor = {Antonucci, Alessandro and Corani, Giorgio and Campos, Cassio Polpo},
  volume = {52},
  series = {Proceedings of Machine Learning Research},
  address = {Lugano, Switzerland},
  month = {06--09 Sep},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v52/kwisthout16.pdf},
  url = {https://proceedings.mlr.press/v52/kwisthout16.html},
  abstract = {Computing posterior and marginal probabilities constitutes the backbone of almost all inferences in Bayesian networks. These computations are known to be intractable in general, both to compute exactly and to approximate by sampling algorithms. While it is well known under what constraints exact computation can be rendered tractable (viz., bounding tree-width of the moralized network and bounding the cardinality of the variables), it is less well known under what constraints approximate Bayesian inference can be tractable. Here, we use the formal framework of fixed-error randomized tractability (a randomized analogue of fixed-parameter tractability) to address this problem, both by re-interpreting known results from the literature and providing some additional new results, including results on fixed-parameter tractable de-randomization of approximate inference.}
}
Endnote
%0 Conference Paper
%T The Parameterized Complexity of Approximate Inference in Bayesian Networks
%A Johan Kwisthout
%B Proceedings of the Eighth International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2016
%E Alessandro Antonucci
%E Giorgio Corani
%E Cassio Polpo Campos
%F pmlr-v52-kwisthout16
%I PMLR
%P 264--274
%U https://proceedings.mlr.press/v52/kwisthout16.html
%V 52
%X Computing posterior and marginal probabilities constitutes the backbone of almost all inferences in Bayesian networks. These computations are known to be intractable in general, both to compute exactly and to approximate by sampling algorithms. While it is well known under what constraints exact computation can be rendered tractable (viz., bounding tree-width of the moralized network and bounding the cardinality of the variables), it is less well known under what constraints approximate Bayesian inference can be tractable. Here, we use the formal framework of fixed-error randomized tractability (a randomized analogue of fixed-parameter tractability) to address this problem, both by re-interpreting known results from the literature and providing some additional new results, including results on fixed-parameter tractable de-randomization of approximate inference.
RIS
TY - CPAPER
TI - The Parameterized Complexity of Approximate Inference in Bayesian Networks
AU - Johan Kwisthout
BT - Proceedings of the Eighth International Conference on Probabilistic Graphical Models
DA - 2016/08/15
ED - Alessandro Antonucci
ED - Giorgio Corani
ED - Cassio Polpo Campos
ID - pmlr-v52-kwisthout16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 52
SP - 264
EP - 274
L1 - http://proceedings.mlr.press/v52/kwisthout16.pdf
UR - https://proceedings.mlr.press/v52/kwisthout16.html
AB - Computing posterior and marginal probabilities constitutes the backbone of almost all inferences in Bayesian networks. These computations are known to be intractable in general, both to compute exactly and to approximate by sampling algorithms. While it is well known under what constraints exact computation can be rendered tractable (viz., bounding tree-width of the moralized network and bounding the cardinality of the variables), it is less well known under what constraints approximate Bayesian inference can be tractable. Here, we use the formal framework of fixed-error randomized tractability (a randomized analogue of fixed-parameter tractability) to address this problem, both by re-interpreting known results from the literature and providing some additional new results, including results on fixed-parameter tractable de-randomization of approximate inference.
ER -
APA
Kwisthout, J. (2016). The Parameterized Complexity of Approximate Inference in Bayesian Networks. Proceedings of the Eighth International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 52:264-274. Available from https://proceedings.mlr.press/v52/kwisthout16.html.
