Graphical Nonconvex Optimization via an Adaptive Convex Relaxation

Qiang Sun, Kean Ming Tan, Han Liu, Tong Zhang
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4810-4817, 2018.

Abstract

We consider the problem of learning high-dimensional Gaussian graphical models. The graphical lasso is one of the most popular methods for estimating Gaussian graphical models. However, it does not achieve the oracle rate of convergence. In this paper, we propose a graphical nonconvex optimization approach for optimal estimation in Gaussian graphical models, which is then approximated by a sequence of convex programs. Our proposal is computationally tractable and produces an estimator that achieves the oracle rate of convergence. The statistical error introduced by the sequential approximation using a sequence of convex programs is characterized via a contraction property. The proposed methodology is then extended to modeling semiparametric graphical models. We show via numerical studies that the proposed estimator outperforms other popular methods for estimating Gaussian graphical models.
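
The adaptive convex relaxation described in the abstract can be prototyped in a few dozen lines. The sketch below is written under assumptions rather than taken from the paper: it approximates a nonconvex (MCP-type) penalized log-likelihood by a short sequence of weighted graphical-lasso problems, where each stage solves a convex program whose entry-wise penalty weights equal the nonconvex penalty's derivative evaluated at the previous stage's estimate, and the first stage reduces to the ordinary graphical lasso. The ADMM inner solver, the MCP penalty, the number of stages, and all tuning constants are illustrative choices, not the authors' implementation.

import numpy as np

def weighted_graphical_lasso(S, Lam, rho=1.0, n_iter=200, tol=1e-5):
    """ADMM for: min_Theta -logdet(Theta) + tr(S @ Theta) + sum_jk Lam_jk |Theta_jk|,
    where Lam is a symmetric, nonnegative matrix of entry-wise penalty weights."""
    p = S.shape[0]
    Z = np.eye(p)
    U = np.zeros((p, p))
    for _ in range(n_iter):
        # Theta update: closed form via eigendecomposition of rho*(Z - U) - S
        d, Q = np.linalg.eigh(rho * (Z - U) - S)
        theta_eig = (d + np.sqrt(d ** 2 + 4.0 * rho)) / (2.0 * rho)
        Theta = (Q * theta_eig) @ Q.T
        # Z update: entry-wise soft-thresholding with thresholds Lam / rho
        Z_old = Z
        A = Theta + U
        Z = np.sign(A) * np.maximum(np.abs(A) - Lam / rho, 0.0)
        # Scaled dual update
        U = U + Theta - Z
        if np.linalg.norm(Z - Z_old, "fro") < tol * max(1.0, np.linalg.norm(Z_old, "fro")):
            break
    return Z

def mcp_derivative(t, lam, gamma=2.0):
    """Derivative of the MCP penalty: large entries receive little or no shrinkage."""
    return np.maximum(lam - t / gamma, 0.0)

def adaptive_relaxation(S, lam, n_stages=5, gamma=2.0):
    """Sequence of weighted graphical-lasso problems: stage-l weights come from the
    nonconvex penalty derivative evaluated at the stage-(l-1) estimate."""
    p = S.shape[0]
    Lam = lam * (np.ones((p, p)) - np.eye(p))   # first stage = ordinary graphical lasso
    Theta = None
    for _ in range(n_stages):
        Theta = weighted_graphical_lasso(S, Lam)
        Lam = mcp_derivative(np.abs(Theta), lam, gamma)
        np.fill_diagonal(Lam, 0.0)              # leave the diagonal unpenalized
    return Theta

if __name__ == "__main__":
    # Hypothetical toy example: sparse tridiagonal precision matrix
    rng = np.random.default_rng(0)
    p, n = 20, 400
    Omega = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
    X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(Omega), size=n)
    S = np.cov(X, rowvar=False)
    Theta_hat = adaptive_relaxation(S, lam=0.2)
    print("estimated off-diagonal nonzeros:", int((np.abs(Theta_hat) > 1e-4).sum() - p))

Because each stage is a convex program, any standard graphical-lasso solver that accepts entry-wise penalty weights could replace the ADMM routine; the contraction property established in the paper concerns how the statistical error of successive stages shrinks toward the oracle rate.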

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-sun18c,
  title     = {Graphical Nonconvex Optimization via an Adaptive Convex Relaxation},
  author    = {Sun, Qiang and Tan, Kean Ming and Liu, Han and Zhang, Tong},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {4810--4817},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/sun18c/sun18c.pdf},
  url       = {https://proceedings.mlr.press/v80/sun18c.html},
  abstract  = {We consider the problem of learning high-dimensional Gaussian graphical models. The graphical lasso is one of the most popular methods for estimating Gaussian graphical models. However, it does not achieve the oracle rate of convergence. In this paper, we propose the graphical nonconvex optimization for optimal estimation in Gaussian graphical models, which is then approximated by a sequence of convex programs. Our proposal is computationally tractable and produces an estimator that achieves the oracle rate of convergence. The statistical error introduced by the sequential approximation using a sequence of convex programs is clearly demonstrated via a contraction property. The proposed methodology is then extended to modeling semiparametric graphical models. We show via numerical studies that the proposed estimator outperforms other popular methods for estimating Gaussian graphical models.}
}
Endnote
%0 Conference Paper
%T Graphical Nonconvex Optimization via an Adaptive Convex Relaxation
%A Qiang Sun
%A Kean Ming Tan
%A Han Liu
%A Tong Zhang
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-sun18c
%I PMLR
%P 4810--4817
%U https://proceedings.mlr.press/v80/sun18c.html
%V 80
%X We consider the problem of learning high-dimensional Gaussian graphical models. The graphical lasso is one of the most popular methods for estimating Gaussian graphical models. However, it does not achieve the oracle rate of convergence. In this paper, we propose the graphical nonconvex optimization for optimal estimation in Gaussian graphical models, which is then approximated by a sequence of convex programs. Our proposal is computationally tractable and produces an estimator that achieves the oracle rate of convergence. The statistical error introduced by the sequential approximation using a sequence of convex programs is clearly demonstrated via a contraction property. The proposed methodology is then extended to modeling semiparametric graphical models. We show via numerical studies that the proposed estimator outperforms other popular methods for estimating Gaussian graphical models.
APA
Sun, Q., Tan, K. M., Liu, H., & Zhang, T. (2018). Graphical Nonconvex Optimization via an Adaptive Convex Relaxation. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:4810-4817. Available from https://proceedings.mlr.press/v80/sun18c.html.