HOP-MAP: Efficient Message Passing with High Order Potentials

Daniel Tarlow, Inmar Givoni, Richard Zemel
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:812-819, 2010.

Abstract

There is a growing interest in building probabilistic models with high order potentials (HOPs), or interactions, among discrete variables. Message passing inference in such models generally takes time exponential in the size of the interaction, but in some cases maximum a posteriori (MAP) inference can be carried out efficiently. We build upon such results, introducing two new classes, including composite HOPs that allow us to flexibly combine tractable HOPs using simple logical switching rules. We present efficient message update algorithms for the new HOPs, and we improve upon the efficiency of message updates for a general class of existing HOPs. Importantly, we present both new and existing HOPs in a common representation; performing inference with any combination of these HOPs requires no change of representations or new derivations.

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-tarlow10a,
  title =     {HOP-MAP: Efficient Message Passing with High Order Potentials},
  author =    {Tarlow, Daniel and Givoni, Inmar and Zemel, Richard},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages =     {812--819},
  year =      {2010},
  editor =    {Teh, Yee Whye and Titterington, Mike},
  volume =    {9},
  series =    {Proceedings of Machine Learning Research},
  address =   {Chia Laguna Resort, Sardinia, Italy},
  month =     {13--15 May},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v9/tarlow10a/tarlow10a.pdf},
  url =       {https://proceedings.mlr.press/v9/tarlow10a.html},
  abstract =  {There is a growing interest in building probabilistic models with high order potentials (HOPs), or interactions, among discrete variables. Message passing inference in such models generally takes time exponential in the size of the interaction, but in some cases maximum a posteriori (MAP) inference can be carried out efficiently. We build upon such results, introducing two new classes, including composite HOPs that allow us to flexibly combine tractable HOPs using simple logical switching rules. We present efficient message update algorithms for the new HOPs, and we improve upon the efficiency of message updates for a general class of existing HOPs. Importantly, we present both new and existing HOPs in a common representation; performing inference with any combination of these HOPs requires no change of representations or new derivations.}
}
Endnote
%0 Conference Paper
%T HOP-MAP: Efficient Message Passing with High Order Potentials
%A Daniel Tarlow
%A Inmar Givoni
%A Richard Zemel
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-tarlow10a
%I PMLR
%P 812--819
%U https://proceedings.mlr.press/v9/tarlow10a.html
%V 9
%X There is a growing interest in building probabilistic models with high order potentials (HOPs), or interactions, among discrete variables. Message passing inference in such models generally takes time exponential in the size of the interaction, but in some cases maximum a posteriori (MAP) inference can be carried out efficiently. We build upon such results, introducing two new classes, including composite HOPs that allow us to flexibly combine tractable HOPs using simple logical switching rules. We present efficient message update algorithms for the new HOPs, and we improve upon the efficiency of message updates for a general class of existing HOPs. Importantly, we present both new and existing HOPs in a common representation; performing inference with any combination of these HOPs requires no change of representations or new derivations.
RIS
TY  - CPAPER
TI  - HOP-MAP: Efficient Message Passing with High Order Potentials
AU  - Daniel Tarlow
AU  - Inmar Givoni
AU  - Richard Zemel
BT  - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
DA  - 2010/03/31
ED  - Yee Whye Teh
ED  - Mike Titterington
ID  - pmlr-v9-tarlow10a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 9
SP  - 812
EP  - 819
L1  - http://proceedings.mlr.press/v9/tarlow10a/tarlow10a.pdf
UR  - https://proceedings.mlr.press/v9/tarlow10a.html
AB  - There is a growing interest in building probabilistic models with high order potentials (HOPs), or interactions, among discrete variables. Message passing inference in such models generally takes time exponential in the size of the interaction, but in some cases maximum a posteriori (MAP) inference can be carried out efficiently. We build upon such results, introducing two new classes, including composite HOPs that allow us to flexibly combine tractable HOPs using simple logical switching rules. We present efficient message update algorithms for the new HOPs, and we improve upon the efficiency of message updates for a general class of existing HOPs. Importantly, we present both new and existing HOPs in a common representation; performing inference with any combination of these HOPs requires no change of representations or new derivations.
ER  -
APA
Tarlow, D., Givoni, I. & Zemel, R. (2010). HOP-MAP: Efficient Message Passing with High Order Potentials. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 9:812-819. Available from https://proceedings.mlr.press/v9/tarlow10a.html.