Loopy Belief Propagation for Bipartite Maximum Weight b-Matching

Bert Huang, Tony Jebara
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:195-202, 2007.

Abstract

We formulate the weighted b-matching objective function as a probability distribution function and prove that belief propagation (BP) on its graphical model converges to the optimum. Standard BP on our graphical model cannot be computed in polynomial time, but we introduce an algebraic method to circumvent the combinatorial message updates. Empirically, the resulting algorithm is on average faster than popular combinatorial implementations, while still scaling at the same asymptotic rate of O(bn^3). Furthermore, the algorithm shows promising performance in machine learning applications.
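To make the formulation concrete, the following is a minimal sketch (illustrative notation only; the weight matrix W and binary edge indicators x_ij are assumptions, not necessarily the paper's own symbols) of how a bipartite maximum weight b-matching objective can be recast as a probability distribution whose most probable assignment is the optimal matching:

  \max_{x \in \{0,1\}^{n \times n}} \; \sum_{i,j} W_{ij}\, x_{ij}
  \quad \text{subject to} \quad \sum_{j} x_{ij} = b \;\; \forall i,
  \qquad \sum_{i} x_{ij} = b \;\; \forall j

  % Exponentiating the objective and encoding the degree constraints as
  % indicator factors gives an unnormalized distribution:
  p(x) \;\propto\; \exp\!\Big( \sum_{i,j} W_{ij}\, x_{ij} \Big)
  \prod_{i} \mathbf{1}\Big[ \sum_{j} x_{ij} = b \Big]
  \prod_{j} \mathbf{1}\Big[ \sum_{i} x_{ij} = b \Big]

Because p(x) is monotone in the matching weight over feasible assignments, its MAP configuration coincides with the maximum weight b-matching, so running max-product (loopy) belief propagation on the corresponding graphical model amounts to searching for the optimal matching.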

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-huang07a,
  title = {Loopy Belief Propagation for Bipartite Maximum Weight b-Matching},
  author = {Bert Huang and Tony Jebara},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages = {195--202},
  year = {2007},
  editor = {Marina Meila and Xiaotong Shen},
  volume = {2},
  series = {Proceedings of Machine Learning Research},
  address = {San Juan, Puerto Rico},
  month = {21--24 Mar},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v2/huang07a/huang07a.pdf},
  url = {http://proceedings.mlr.press/v2/huang07a.html},
  abstract = {We formulate the weighted b-matching objective function as a probability distribution function and prove that belief propagation (BP) on its graphical model converges to the optimum. Standard BP on our graphical model cannot be computed in polynomial time, but we introduce an algebraic method to circumvent the combinatorial message updates. Empirically, the resulting algorithm is on average faster than popular combinatorial implementations, while still scaling at the same asymptotic rate of O(bn^3). Furthermore, the algorithm shows promising performance in machine learning applications.}
}
Endnote
%0 Conference Paper
%T Loopy Belief Propagation for Bipartite Maximum Weight b-Matching
%A Bert Huang
%A Tony Jebara
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-huang07a
%I PMLR
%J Proceedings of Machine Learning Research
%P 195--202
%U http://proceedings.mlr.press
%V 2
%W PMLR
%X We formulate the weighted b-matching objective function as a probability distribution function and prove that belief propagation (BP) on its graphical model converges to the optimum. Standard BP on our graphical model cannot be computed in polynomial time, but we introduce an algebraic method to circumvent the combinatorial message updates. Empirically, the resulting algorithm is on average faster than popular combinatorial implementations, while still scaling at the same asymptotic rate of O(bn^3). Furthermore, the algorithm shows promising performance in machine learning applications.
RIS
TY  - CPAPER
TI  - Loopy Belief Propagation for Bipartite Maximum Weight b-Matching
AU  - Bert Huang
AU  - Tony Jebara
BT  - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
PY  - 2007/03/11
DA  - 2007/03/11
ED  - Marina Meila
ED  - Xiaotong Shen
ID  - pmlr-v2-huang07a
PB  - PMLR
SP  - 195
DP  - PMLR
EP  - 202
L1  - http://proceedings.mlr.press/v2/huang07a/huang07a.pdf
UR  - http://proceedings.mlr.press/v2/huang07a.html
AB  - We formulate the weighted b-matching objective function as a probability distribution function and prove that belief propagation (BP) on its graphical model converges to the optimum. Standard BP on our graphical model cannot be computed in polynomial time, but we introduce an algebraic method to circumvent the combinatorial message updates. Empirically, the resulting algorithm is on average faster than popular combinatorial implementations, while still scaling at the same asymptotic rate of O(bn^3). Furthermore, the algorithm shows promising performance in machine learning applications.
ER  -
APA
Huang, B. & Jebara, T. (2007). Loopy Belief Propagation for Bipartite Maximum Weight b-Matching. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in PMLR 2:195-202.

Related Material