Unifying Local Consistency and MAX SAT Relaxations for Scalable Inference with Rounding Guarantees

Stephen Bach, Bert Huang, Lise Getoor
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:46-55, 2015.

Abstract

We prove the equivalence of first-order local consistency relaxations and the MAX SAT relaxation of Goemans and Williamson (1994) for a class of MRFs we refer to as logical MRFs. This allows us to combine the advantages of each into a single MAP inference technique: solving the local consistency relaxation with any of a number of highly scalable message-passing algorithms, and then obtaining a high-quality discrete solution via a guaranteed rounding procedure when the relaxation is not tight. Logical MRFs are a general class of models that can incorporate many common dependencies, such as logical implications and mixtures of supermodular and submodular potentials. They can be used for many structured prediction tasks, including natural language processing, computer vision, and computational social science. We show that our new inference technique can improve solution quality by as much as 20% without sacrificing speed on problems with over one million dependencies.
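The rounding step the abstract refers to is the Goemans-Williamson (1994) randomized rounding for MAX SAT: given fractional values y_i in [0,1] from the relaxation, each Boolean variable is set true with probability f(y_i), where f lies in the envelope [1 - 4^(-y), 4^(y-1)]; any such f yields a 3/4-approximation guarantee in expectation. The sketch below is not the paper's implementation, just a minimal illustration of that rounding rule using the standard linear choice f(y) = (2y + 1)/4 and a hypothetical signed-literal clause encoding (+i for x_i, -i for NOT x_i).

```python
import random

def gw_round(y, clauses, weights, trials=100, seed=0):
    """Round fractional relaxation values y[i] in [0, 1] to booleans with
    the Goemans-Williamson rule: x[i] = True with probability
    f(y[i]) = (2*y[i] + 1) / 4, a linear function inside the GW envelope
    [1 - 4**(-y), 4**(y - 1)].  Returns the best of `trials` samples.

    Clauses are lists of signed literals (+i means x_i, -i means NOT x_i,
    variables indexed from 1) -- an illustrative encoding, not the paper's.
    """
    rng = random.Random(seed)

    def satisfied_weight(x):
        # Total weight of clauses satisfied by the assignment x.
        return sum(w for c, w in zip(clauses, weights)
                   if any(x[abs(l)] if l > 0 else not x[abs(l)] for l in c))

    best, best_w = None, -1.0
    for _ in range(trials):
        x = {i: rng.random() < (2 * y[i] + 1) / 4 for i in y}
        w = satisfied_weight(x)
        if w > best_w:
            best, best_w = x, w
    return best, best_w
```

For example, with clauses `[[1], [-1, 2]]`, unit weights, and relaxation values of 1.0 for both variables, repeated sampling quickly finds an assignment satisfying both clauses. In the paper's pipeline, the fractional input would come from solving the local consistency relaxation with a scalable message-passing algorithm; when that relaxation is tight, no rounding is needed at all.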

Cite this Paper


BibTeX
@InProceedings{pmlr-v38-bach15,
  title     = {{Unifying Local Consistency and MAX SAT Relaxations for Scalable Inference with Rounding Guarantees}},
  author    = {Bach, Stephen and Huang, Bert and Getoor, Lise},
  booktitle = {Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {46--55},
  year      = {2015},
  editor    = {Lebanon, Guy and Vishwanathan, S. V. N.},
  volume    = {38},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Diego, California, USA},
  month     = {09--12 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v38/bach15.pdf},
  url       = {https://proceedings.mlr.press/v38/bach15.html},
  abstract  = {We prove the equivalence of first-order local consistency relaxations and the MAX SAT relaxation of Goemans and Williamson (1994) for a class of MRFs we refer to as logical MRFs. This allows us to combine the advantages of each into a single MAP inference technique: solving the local consistency relaxation with any of a number of highly scalable message-passing algorithms, and then obtaining a high-quality discrete solution via a guaranteed rounding procedure when the relaxation is not tight. Logical MRFs are a general class of models that can incorporate many common dependencies, such as logical implications and mixtures of supermodular and submodular potentials. They can be used for many structured prediction tasks, including natural language processing, computer vision, and computational social science. We show that our new inference technique can improve solution quality by as much as 20% without sacrificing speed on problems with over one million dependencies.}
}
Endnote
%0 Conference Paper
%T Unifying Local Consistency and MAX SAT Relaxations for Scalable Inference with Rounding Guarantees
%A Stephen Bach
%A Bert Huang
%A Lise Getoor
%B Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2015
%E Guy Lebanon
%E S. V. N. Vishwanathan
%F pmlr-v38-bach15
%I PMLR
%P 46--55
%U https://proceedings.mlr.press/v38/bach15.html
%V 38
%X We prove the equivalence of first-order local consistency relaxations and the MAX SAT relaxation of Goemans and Williamson (1994) for a class of MRFs we refer to as logical MRFs. This allows us to combine the advantages of each into a single MAP inference technique: solving the local consistency relaxation with any of a number of highly scalable message-passing algorithms, and then obtaining a high-quality discrete solution via a guaranteed rounding procedure when the relaxation is not tight. Logical MRFs are a general class of models that can incorporate many common dependencies, such as logical implications and mixtures of supermodular and submodular potentials. They can be used for many structured prediction tasks, including natural language processing, computer vision, and computational social science. We show that our new inference technique can improve solution quality by as much as 20% without sacrificing speed on problems with over one million dependencies.
RIS
TY  - CPAPER
TI  - Unifying Local Consistency and MAX SAT Relaxations for Scalable Inference with Rounding Guarantees
AU  - Stephen Bach
AU  - Bert Huang
AU  - Lise Getoor
BT  - Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
DA  - 2015/02/21
ED  - Guy Lebanon
ED  - S. V. N. Vishwanathan
ID  - pmlr-v38-bach15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 38
SP  - 46
EP  - 55
L1  - http://proceedings.mlr.press/v38/bach15.pdf
UR  - https://proceedings.mlr.press/v38/bach15.html
AB  - We prove the equivalence of first-order local consistency relaxations and the MAX SAT relaxation of Goemans and Williamson (1994) for a class of MRFs we refer to as logical MRFs. This allows us to combine the advantages of each into a single MAP inference technique: solving the local consistency relaxation with any of a number of highly scalable message-passing algorithms, and then obtaining a high-quality discrete solution via a guaranteed rounding procedure when the relaxation is not tight. Logical MRFs are a general class of models that can incorporate many common dependencies, such as logical implications and mixtures of supermodular and submodular potentials. They can be used for many structured prediction tasks, including natural language processing, computer vision, and computational social science. We show that our new inference technique can improve solution quality by as much as 20% without sacrificing speed on problems with over one million dependencies.
ER  -
APA
Bach, S., Huang, B. & Getoor, L. (2015). Unifying Local Consistency and MAX SAT Relaxations for Scalable Inference with Rounding Guarantees. Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 38:46-55. Available from https://proceedings.mlr.press/v38/bach15.html.