Tensor Belief Propagation

Andrew Wrigley, Wee Sun Lee, Nan Ye
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:3771-3779, 2017.

Abstract

We propose a new approximate inference algorithm for graphical models, tensor belief propagation, based on approximating the messages passed in the junction tree algorithm. Our algorithm represents the potential functions of the graphical model and all messages on the junction tree compactly as mixtures of rank-1 tensors. Using this representation, we show how to perform the operations required for inference on the junction tree efficiently: marginalisation can be computed quickly due to the factored form of rank-1 tensors while multiplication can be approximated using sampling. Our analysis gives sufficient conditions for the algorithm to perform well, including for the case of high-treewidth graphs, for which exact inference is intractable. We compare our algorithm experimentally with several approximate inference algorithms and show that it performs well.
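The key efficiency claim above, that marginalisation is cheap when a potential is stored as a mixture of rank-1 tensors, can be illustrated with a minimal NumPy sketch (not the paper's implementation; sizes, variable names, and the three-variable potential are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
K, d = 4, 3  # number of mixture components and variable cardinality (illustrative)

# A potential over (x1, x2, x3) stored as a mixture of K rank-1 tensors:
# phi(x1, x2, x3) = sum_k w_k * a_k(x1) * b_k(x2) * c_k(x3)
w = rng.random(K)       # mixture weights
A = rng.random((K, d))  # a_k(x1) for each component k
B = rng.random((K, d))  # b_k(x2)
C = rng.random((K, d))  # c_k(x3)

# Dense tensor built explicitly, only for checking the result.
phi = np.einsum('k,ki,kj,kl->ijl', w, A, B, C)

# Marginalising out x3 in the factored form: each component's factor
# c_k collapses to a scalar sum, so the cost is linear in K and d
# rather than exponential in the number of variables.
w_marg = w * C.sum(axis=1)
phi_marg_factored = np.einsum('k,ki,kj->ij', w_marg, A, B)

# The factored marginal matches the dense marginal.
assert np.allclose(phi.sum(axis=2), phi_marg_factored)
```

Multiplying two such mixtures exactly would multiply the component counts, which is why the paper approximates the product by sampling components instead.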

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-wrigley17a,
  title     = {Tensor Belief Propagation},
  author    = {Andrew Wrigley and Wee Sun Lee and Nan Ye},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {3771--3779},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/wrigley17a/wrigley17a.pdf},
  url       = {https://proceedings.mlr.press/v70/wrigley17a.html},
  abstract  = {We propose a new approximate inference algorithm for graphical models, tensor belief propagation, based on approximating the messages passed in the junction tree algorithm. Our algorithm represents the potential functions of the graphical model and all messages on the junction tree compactly as mixtures of rank-1 tensors. Using this representation, we show how to perform the operations required for inference on the junction tree efficiently: marginalisation can be computed quickly due to the factored form of rank-1 tensors while multiplication can be approximated using sampling. Our analysis gives sufficient conditions for the algorithm to perform well, including for the case of high-treewidth graphs, for which exact inference is intractable. We compare our algorithm experimentally with several approximate inference algorithms and show that it performs well.}
}
Endnote
%0 Conference Paper
%T Tensor Belief Propagation
%A Andrew Wrigley
%A Wee Sun Lee
%A Nan Ye
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-wrigley17a
%I PMLR
%P 3771--3779
%U https://proceedings.mlr.press/v70/wrigley17a.html
%V 70
%X We propose a new approximate inference algorithm for graphical models, tensor belief propagation, based on approximating the messages passed in the junction tree algorithm. Our algorithm represents the potential functions of the graphical model and all messages on the junction tree compactly as mixtures of rank-1 tensors. Using this representation, we show how to perform the operations required for inference on the junction tree efficiently: marginalisation can be computed quickly due to the factored form of rank-1 tensors while multiplication can be approximated using sampling. Our analysis gives sufficient conditions for the algorithm to perform well, including for the case of high-treewidth graphs, for which exact inference is intractable. We compare our algorithm experimentally with several approximate inference algorithms and show that it performs well.
APA
Wrigley, A., Lee, W.S. & Ye, N. (2017). Tensor Belief Propagation. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:3771-3779. Available from https://proceedings.mlr.press/v70/wrigley17a.html.