Particle Belief Propagation

Alexander Ihler, David McAllester; Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:256-263, 2009.

Abstract

The popularity of particle filtering for inference in Markov chain models defined over random variables with very large or continuous domains makes it natural to consider sample-based versions of belief propagation (BP) for more general (tree-structured or loopy) graphs. Several such algorithms have already been proposed in the literature. However, many questions remain open about the behavior of particle-based BP algorithms, and little theory has been developed to analyze their performance. In this paper, we describe a generic particle belief propagation (PBP) algorithm which is closely related to previously proposed methods. We prove that this algorithm is consistent, approaching the true BP messages as the number of samples grows large. We then use concentration bounds to analyze the finite-sample behavior and give a convergence rate for the algorithm on tree-structured graphs. Our convergence rate is O(1/√n), where n is the number of samples, independent of the domain size of the variables.
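To make the abstract's message update concrete: in particle-style BP, each node s holds particles x_s^(1..n) drawn from a proposal W_s, and the message m_{s→t} is estimated at each particle of t by an importance-weighted Monte Carlo sum over the particles of s:

m̂_{s→t}(x_t^(j)) ∝ (1/n) Σ_i [ψ_s(x_s^(i)) ψ_{st}(x_s^(i), x_t^(j)) / W_s(x_s^(i))] · Π_{u ∈ N(s)\t} m_{u→s}(x_s^(i))

Below is a minimal sketch of one such update, not the authors' code; the function names (pbp_message, psi_s, psi_st, proposal_s) and the Gaussian toy potentials in the demo are illustrative assumptions.

```python
# A minimal sketch (assumed interface, not from the paper) of one
# importance-weighted particle-BP message update on a pairwise MRF.
import numpy as np

def pbp_message(parts_s, parts_t, psi_s, psi_st, proposal_s, in_msgs_at_s):
    """Estimate the message m_{s->t} at each particle of node t.

    parts_s      : (n,) particles drawn at node s from proposal_s
    parts_t      : (m,) particles drawn at node t
    psi_s        : callable, node potential at s
    psi_st       : callable, pairwise potential on edge (s, t)
    proposal_s   : callable, density that generated parts_s
    in_msgs_at_s : list of (n,) arrays, incoming messages m_{u->s}
                   evaluated at parts_s, for u in N(s) \ {t}
    Returns an (m,) array normalized to sum to 1.
    """
    # Importance weight of each s-particle: node potential over proposal,
    # times the product of incoming messages (excluding the one from t).
    w = np.array([psi_s(x) / proposal_s(x) for x in parts_s])
    for msg in in_msgs_at_s:
        w = w * msg
    # Monte Carlo sum over s-particles, evaluated at every t-particle.
    m = np.array([np.mean(w * np.array([psi_st(x, y) for x in parts_s]))
                  for y in parts_t])
    return m / m.sum()

if __name__ == "__main__":
    # Toy demo: scalar variables, Gaussian potentials, N(0, 4) proposal.
    rng = np.random.default_rng(0)
    n = 500
    psi = lambda x: np.exp(-0.5 * x**2)              # node potential
    psi_st = lambda x, y: np.exp(-0.5 * (x - y)**2)  # pairwise potential
    W = lambda x: np.exp(-x**2 / 8) / (2 * np.sqrt(2 * np.pi))  # N(0,4) pdf
    xs = rng.normal(0, 2, n)  # particles at s, drawn from W
    xt = rng.normal(0, 2, n)  # particles at t
    msg = pbp_message(xs, xt, psi, psi_st, W, in_msgs_at_s=[])
```

On a tree, sweeping such updates leaf-to-root and back yields the sample-based analogue of exact BP; the paper's O(1/√n) rate bounds how fast these estimated messages approach the true ones.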

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-ihler09a,
  title     = {Particle Belief Propagation},
  author    = {Alexander Ihler and David McAllester},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {256--263},
  year      = {2009},
  editor    = {David van Dyk and Max Welling},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/ihler09a/ihler09a.pdf},
  url       = {http://proceedings.mlr.press/v5/ihler09a.html},
  abstract  = {The popularity of particle filtering for inference in Markov chain models defined over random variables with very large or continuous domains makes it natural to consider sample-based versions of belief propagation (BP) for more general (tree-structured or loopy) graphs. Several such algorithms have already been proposed in the literature. However, many questions remain open about the behavior of particle-based BP algorithms, and little theory has been developed to analyze their performance. In this paper, we describe a generic particle belief propagation (PBP) algorithm which is closely related to previously proposed methods. We prove that this algorithm is consistent, approaching the true BP messages as the number of samples grows large. We then use concentration bounds to analyze the finite-sample behavior and give a convergence rate for the algorithm on tree-structured graphs. Our convergence rate is $O(1/\sqrt{n})$, where n is the number of samples, independent of the domain size of the variables.}
}
Endnote
%0 Conference Paper
%T Particle Belief Propagation
%A Alexander Ihler
%A David McAllester
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-ihler09a
%I PMLR
%J Proceedings of Machine Learning Research
%P 256--263
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X The popularity of particle filtering for inference in Markov chain models defined over random variables with very large or continuous domains makes it natural to consider sample-based versions of belief propagation (BP) for more general (tree-structured or loopy) graphs. Several such algorithms have already been proposed in the literature. However, many questions remain open about the behavior of particle-based BP algorithms, and little theory has been developed to analyze their performance. In this paper, we describe a generic particle belief propagation (PBP) algorithm which is closely related to previously proposed methods. We prove that this algorithm is consistent, approaching the true BP messages as the number of samples grows large. We then use concentration bounds to analyze the finite-sample behavior and give a convergence rate for the algorithm on tree-structured graphs. Our convergence rate is O(1/√n), where n is the number of samples, independent of the domain size of the variables.
RIS
TY - CPAPER
TI - Particle Belief Propagation
AU - Alexander Ihler
AU - David McAllester
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY - 2009/04/15
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-ihler09a
PB - PMLR
SP - 256
EP - 263
DP - PMLR
L1 - http://proceedings.mlr.press/v5/ihler09a/ihler09a.pdf
UR - http://proceedings.mlr.press/v5/ihler09a.html
AB - The popularity of particle filtering for inference in Markov chain models defined over random variables with very large or continuous domains makes it natural to consider sample-based versions of belief propagation (BP) for more general (tree-structured or loopy) graphs. Several such algorithms have already been proposed in the literature. However, many questions remain open about the behavior of particle-based BP algorithms, and little theory has been developed to analyze their performance. In this paper, we describe a generic particle belief propagation (PBP) algorithm which is closely related to previously proposed methods. We prove that this algorithm is consistent, approaching the true BP messages as the number of samples grows large. We then use concentration bounds to analyze the finite-sample behavior and give a convergence rate for the algorithm on tree-structured graphs. Our convergence rate is O(1/√n), where n is the number of samples, independent of the domain size of the variables.
ER -
APA
Ihler, A. & McAllester, D. (2009). Particle Belief Propagation. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:256-263.
