# The Parameterized Complexity of Approximate Inference in Bayesian Networks

*Proceedings of the Eighth International Conference on Probabilistic Graphical Models*, PMLR 52:264-274, 2016.

#### Abstract

Computing posterior and marginal probabilities constitutes the backbone of almost all inference in Bayesian networks. These computations are known to be intractable in general, both to compute exactly and to approximate by sampling algorithms. While it is well known under what constraints *exact* computation can be rendered tractable (viz., bounding the tree-width of the moralized network and bounding the cardinality of the variables), less is known about the constraints under which *approximate* Bayesian inference can be tractable. Here, we use the formal framework of *fixed-error randomized tractability* (a randomized analogue of fixed-parameter tractability) to address this problem, both by re-interpreting known results from the literature and by providing some additional new results, including results on fixed-parameter tractable de-randomization of approximate inference.
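As a concrete illustration of the kind of sampling-based approximate inference whose tractability the paper analyzes, the sketch below estimates a posterior probability in a toy "sprinkler" Bayesian network by rejection sampling. The network structure, CPT values, and function name are illustrative assumptions, not taken from the paper.

```python
import random

def estimate_p_rain_given_wet(n_samples: int, seed: int = 0) -> float:
    """Estimate P(Rain | GrassWet) by rejection sampling in a toy network.

    Assumed (hypothetical) network: Rain -> Sprinkler, and GrassWet
    depends on both Rain and Sprinkler. All probabilities are made up
    for illustration only.
    """
    rng = random.Random(seed)
    accepted = 0       # samples consistent with the evidence GrassWet = true
    rain_and_wet = 0   # of those, samples where Rain = true
    for _ in range(n_samples):
        rain = rng.random() < 0.2
        # Sprinkler is rarely on when it rains
        sprinkler = rng.random() < (0.01 if rain else 0.4)
        p_wet = {(True, True): 0.99, (True, False): 0.90,
                 (False, True): 0.80, (False, False): 0.0}[(sprinkler, rain)]
        wet = rng.random() < p_wet
        if wet:  # reject samples inconsistent with the evidence
            accepted += 1
            rain_and_wet += rain
    return rain_and_wet / accepted  # relative frequency approximates the posterior
```

The estimate converges to the exact posterior (about 0.358 for these CPTs) as the sample count grows, but the number of samples needed to guarantee a small error with high probability is exactly the quantity that makes approximate inference intractable in general, and that the fixed-error randomized tractability framework parameterizes.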