Learning Ising and Potts Models with Latent Variables
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:3557-3566, 2020.
Abstract
We study the problem of learning graphical models with latent variables. We give the first efficient algorithms for learning: 1) ferromagnetic Ising models with latent variables under arbitrary external fields, and 2) ferromagnetic Potts models with latent variables under a unidirectional non-negative external field. Our algorithms have optimal dependence on the dimension but suffer from a sub-optimal dependence on the sparsity of the underlying graph. Our results rely on two structural properties of the underlying graphical models, which in turn allow us to design an influence function that can be maximized greedily to recover the structure of the underlying graphical model. These structural results may be of independent interest.
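To make the greedy approach concrete, the following is a minimal sketch of structure recovery for a single node by greedy influence maximization. The specific influence function used here (an empirical conditional covariance), the threshold tau, the degree cap, and the conditioning rule are illustrative assumptions for exposition, not the definitions used in the paper.

```python
import numpy as np

def greedy_neighborhood(samples, u, tau=0.05, max_degree=5):
    """Greedily recover a candidate neighborhood of node u.

    samples    : (n, p) array of +/-1 spins from the observed variables.
    u          : index of the node whose neighborhood we estimate.
    tau        : stop when no remaining node has influence above tau.
    max_degree : cap on the neighborhood size.

    'Influence' of v on u is taken to be the empirical covariance of
    X_u and X_v on the samples consistent with the current conditioning,
    a stand-in for the paper's influence function.
    """
    n, p = samples.shape
    neighborhood = []
    active = samples  # samples used for the (conditional) estimates
    for _ in range(max_degree):
        best_v, best_inf = None, tau
        for v in range(p):
            if v == u or v in neighborhood:
                continue
            cov = (np.mean(active[:, u] * active[:, v])
                   - np.mean(active[:, u]) * np.mean(active[:, v]))
            if cov > best_inf:
                best_v, best_inf = v, cov
        if best_v is None:
            break  # no candidate exceeds the influence threshold
        neighborhood.append(best_v)
        # condition on the newly added neighbor being +1 (one simple choice)
        active = active[active[:, best_v] == 1]
        if len(active) == 0:
            break
    return neighborhood
```

Running this routine once per node and taking the union of the recovered edges gives a candidate graph; the guarantees in the paper depend on the structural properties of the ferromagnetic models, which this sketch does not attempt to reproduce.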