Outlier-Robust Learning of Ising Models Under Dobrushin’s Condition

Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart, Yuxin Sun
Proceedings of Thirty Fourth Conference on Learning Theory, PMLR 134:1645-1682, 2021.

Abstract

We study the problem of learning Ising models satisfying Dobrushin’s condition in the outlier-robust setting where a constant fraction of the samples are adversarially corrupted. Our main result is to provide the first computationally efficient robust learning algorithm for this problem with near-optimal error guarantees. Our algorithm can be seen as a special case of an algorithm for robustly learning a distribution from a general exponential family. To prove its correctness for Ising models, we establish new anti-concentration results for degree-2 polynomials of Ising models that may be of independent interest.

Cite this Paper


BibTeX
@InProceedings{pmlr-v134-diakonikolas21e,
  title     = {Outlier-Robust Learning of Ising Models Under Dobrushin’s Condition},
  author    = {Diakonikolas, Ilias and Kane, Daniel M. and Stewart, Alistair and Sun, Yuxin},
  booktitle = {Proceedings of Thirty Fourth Conference on Learning Theory},
  pages     = {1645--1682},
  year      = {2021},
  editor    = {Belkin, Mikhail and Kpotufe, Samory},
  volume    = {134},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--19 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v134/diakonikolas21e/diakonikolas21e.pdf},
  url       = {https://proceedings.mlr.press/v134/diakonikolas21e.html},
  abstract  = {We study the problem of learning Ising models satisfying Dobrushin’s condition in the outlier-robust setting where a constant fraction of the samples are adversarially corrupted. Our main result is to provide the first computationally efficient robust learning algorithm for this problem with near-optimal error guarantees. Our algorithm can be seen as a special case of an algorithm for robustly learning a distribution from a general exponential family. To prove its correctness for Ising models, we establish new anti-concentration results for degree-2 polynomials of Ising models that may be of independent interest.}
}
Endnote
%0 Conference Paper
%T Outlier-Robust Learning of Ising Models Under Dobrushin’s Condition
%A Ilias Diakonikolas
%A Daniel M. Kane
%A Alistair Stewart
%A Yuxin Sun
%B Proceedings of Thirty Fourth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2021
%E Mikhail Belkin
%E Samory Kpotufe
%F pmlr-v134-diakonikolas21e
%I PMLR
%P 1645--1682
%U https://proceedings.mlr.press/v134/diakonikolas21e.html
%V 134
%X We study the problem of learning Ising models satisfying Dobrushin’s condition in the outlier-robust setting where a constant fraction of the samples are adversarially corrupted. Our main result is to provide the first computationally efficient robust learning algorithm for this problem with near-optimal error guarantees. Our algorithm can be seen as a special case of an algorithm for robustly learning a distribution from a general exponential family. To prove its correctness for Ising models, we establish new anti-concentration results for degree-2 polynomials of Ising models that may be of independent interest.
APA
Diakonikolas, I., Kane, D.M., Stewart, A. & Sun, Y. (2021). Outlier-Robust Learning of Ising Models Under Dobrushin’s Condition. Proceedings of Thirty Fourth Conference on Learning Theory, in Proceedings of Machine Learning Research 134:1645-1682. Available from https://proceedings.mlr.press/v134/diakonikolas21e.html.