Learning Ising Models with Independent Failures

Surbhi Goel, Daniel M. Kane, Adam R. Klivans
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:1449-1469, 2019.

Abstract

We give the first efficient algorithm for learning the structure of an Ising model that tolerates independent failures; that is, each entry of the observed sample is missing with some unknown probability $p$. Our algorithm matches the essentially optimal runtime and sample complexity bounds of recent work for learning Ising models due to Klivans and Meka (2017). We devise a novel unbiased estimator for the gradient of the Interaction Screening Objective (ISO) due to Vuffray et al. (2016) and apply a stochastic multiplicative gradient descent algorithm to minimize this objective. Solutions to this minimization recover the neighborhood information of the underlying Ising model on a node-by-node basis.
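The per-node recovery the abstract describes can be sketched in NumPy. Below is a minimal, fully observed illustration of the Interaction Screening Objective of Vuffray et al. (2016) minimized by plain gradient descent; it is not the paper's contribution, which is an unbiased ISO gradient estimator under missing entries combined with a stochastic multiplicative update. The 3-node chain model, coupling strength `beta`, step size, and iteration count are all illustrative assumptions.

```python
import numpy as np

def iso(theta, X, u):
    """Empirical Interaction Screening Objective (Vuffray et al., 2016)
    for node u on fully observed +/-1 samples X of shape (n, p):
    ISO(theta) = (1/n) * sum_t exp(-x_u^(t) * <theta, x_{-u}^(t)>)."""
    mask = np.arange(X.shape[1]) != u
    z = X[:, u] * (X[:, mask] @ theta)
    return np.exp(-z).mean()

def iso_grad(theta, X, u):
    """Gradient of the empirical ISO with respect to theta."""
    mask = np.arange(X.shape[1]) != u
    z = X[:, u] * (X[:, mask] @ theta)
    return ((-np.exp(-z) * X[:, u])[:, None] * X[:, mask]).mean(axis=0)

# Illustrative 3-node chain Ising model (coupling beta on edges 0-1 and 1-2),
# sampled exactly by enumerating all 2^3 spin configurations.
rng = np.random.default_rng(0)
p, beta = 3, 0.6
J = np.zeros((p, p))
J[0, 1] = J[1, 0] = J[1, 2] = J[2, 1] = beta
states = np.array([[s0, s1, s2] for s0 in (-1, 1)
                   for s1 in (-1, 1) for s2 in (-1, 1)])
logw = 0.5 * np.einsum('ti,ij,tj->t', states, J, states)
probs = np.exp(logw - logw.max())
probs /= probs.sum()
X = states[rng.choice(len(states), size=5000, p=probs)]

# Plain gradient descent on the convex ISO for node u = 1; the paper instead
# uses a stochastic multiplicative update and an unbiased gradient estimator
# built from samples with independently missing entries.
theta = np.zeros(p - 1)
for _ in range(2000):
    theta -= 0.1 * iso_grad(theta, X, u=1)
# theta now approximates node 1's true couplings (beta, beta) to its neighbors.
```

Thresholding the recovered weights gives node 1's neighborhood; repeating the minimization for every node reconstructs the full graph, which is the node-by-node recovery scheme the abstract refers to.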

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-goel19a,
  title     = {Learning Ising Models with Independent Failures},
  author    = {Goel, Surbhi and Kane, Daniel M. and Klivans, Adam R.},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {1449--1469},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/goel19a/goel19a.pdf},
  url       = {https://proceedings.mlr.press/v99/goel19a.html},
  abstract  = {We give the first efficient algorithm for learning the structure of an Ising model that tolerates independent failures; that is, each entry of the observed sample is missing with some unknown probability $p$. Our algorithm matches the essentially optimal runtime and sample complexity bounds of recent work for learning Ising models due to Klivans and Meka (2017). We devise a novel unbiased estimator for the gradient of the Interaction Screening Objective (ISO) due to Vuffray et al. (2016) and apply a stochastic multiplicative gradient descent algorithm to minimize this objective. Solutions to this minimization recover the neighborhood information of the underlying Ising model on a node-by-node basis.}
}
Endnote
%0 Conference Paper
%T Learning Ising Models with Independent Failures
%A Surbhi Goel
%A Daniel M. Kane
%A Adam R. Klivans
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-goel19a
%I PMLR
%P 1449--1469
%U https://proceedings.mlr.press/v99/goel19a.html
%V 99
%X We give the first efficient algorithm for learning the structure of an Ising model that tolerates independent failures; that is, each entry of the observed sample is missing with some unknown probability $p$. Our algorithm matches the essentially optimal runtime and sample complexity bounds of recent work for learning Ising models due to Klivans and Meka (2017). We devise a novel unbiased estimator for the gradient of the Interaction Screening Objective (ISO) due to Vuffray et al. (2016) and apply a stochastic multiplicative gradient descent algorithm to minimize this objective. Solutions to this minimization recover the neighborhood information of the underlying Ising model on a node-by-node basis.
APA
Goel, S., Kane, D.M. & Klivans, A.R. (2019). Learning Ising Models with Independent Failures. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:1449-1469. Available from https://proceedings.mlr.press/v99/goel19a.html.