Block Stability for MAP Inference

Hunter Lang, David Sontag, Aravindan Vijayaraghavan
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:216-225, 2019.

Abstract

Recent work (Lang et al., 2018) has shown that some popular approximate MAP inference algorithms perform very well when the input instance is stable. The simplest stability condition assumes that the MAP solution does not change at all when some of the pairwise potentials are adversarially perturbed. Unfortunately, this strong condition does not seem to hold in practice. We introduce a significantly more relaxed condition that only requires portions of an input instance to be stable. Under this block stability condition, we prove that the pairwise LP relaxation is persistent on the stable blocks. We complement our theoretical results with an evaluation of real-world examples from computer vision, and we find that these instances have large stable regions.

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-lang19a,
  title     = {Block Stability for MAP Inference},
  author    = {Lang, Hunter and Sontag, David and Vijayaraghavan, Aravindan},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {216--225},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/lang19a/lang19a.pdf},
  url       = {https://proceedings.mlr.press/v89/lang19a.html},
  abstract  = {Recent work (Lang et al., 2018) has shown that some popular approximate MAP inference algorithms perform very well when the input instance is stable. The simplest stability condition assumes that the MAP solution does not change at all when some of the pairwise potentials are adversarially perturbed. Unfortunately, this strong condition does not seem to hold in practice. We introduce a significantly more relaxed condition that only requires portions of an input instance to be stable. Under this block stability condition, we prove that the pairwise LP relaxation is persistent on the stable blocks. We complement our theoretical results with an evaluation of real-world examples from computer vision, and we find that these instances have large stable regions.}
}
Endnote
%0 Conference Paper
%T Block Stability for MAP Inference
%A Hunter Lang
%A David Sontag
%A Aravindan Vijayaraghavan
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-lang19a
%I PMLR
%P 216--225
%U https://proceedings.mlr.press/v89/lang19a.html
%V 89
%X Recent work (Lang et al., 2018) has shown that some popular approximate MAP inference algorithms perform very well when the input instance is stable. The simplest stability condition assumes that the MAP solution does not change at all when some of the pairwise potentials are adversarially perturbed. Unfortunately, this strong condition does not seem to hold in practice. We introduce a significantly more relaxed condition that only requires portions of an input instance to be stable. Under this block stability condition, we prove that the pairwise LP relaxation is persistent on the stable blocks. We complement our theoretical results with an evaluation of real-world examples from computer vision, and we find that these instances have large stable regions.
APA
Lang, H., Sontag, D., &amp; Vijayaraghavan, A. (2019). Block Stability for MAP Inference. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:216-225. Available from https://proceedings.mlr.press/v89/lang19a.html.