EiGLasso: Scalable Estimation of Cartesian Product of Sparse Inverse Covariance Matrices

Jun Ho Yoon, Seyoung Kim
Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), PMLR 124:1248-1257, 2020.

Abstract

In this paper, we address the problem of jointly estimating dependencies across samples and dependencies across multiple features, where each set of dependencies is modeled as an inverse covariance matrix. In particular, we study a matrix-variate Gaussian distribution with the Kronecker-sum of sample-wise and feature-wise inverse covariances. While this Kronecker-sum model has been studied as an intuitively more appealing convex alternative to the Kronecker-product of two inverse covariance matrices, the existing methods do not scale to large datasets. We introduce a highly efficient optimization method for estimating the Kronecker-sum structured inverse covariance matrix from matrix-variate data. In addition, we describe a simpler alternative to the strategies proposed in previous works for handling the non-identifiability of parameters. Using simulated and real data, we demonstrate that our approach leads to a one-to-two orders-of-magnitude speedup over previous methods.
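The Kronecker-sum model described in the abstract combines a sample-wise precision matrix Θ and a feature-wise precision matrix Ψ into a joint precision matrix Θ ⊕ Ψ = Θ ⊗ I + I ⊗ Ψ. The following is a minimal NumPy sketch of this structure for illustration only; it is not the authors' EiGLasso implementation, and the small matrices are made-up examples:

```python
import numpy as np

def kronecker_sum(theta, psi):
    """Kronecker sum of two square matrices: theta (+) psi = theta (x) I + I (x) psi."""
    p, q = theta.shape[0], psi.shape[0]
    return np.kron(theta, np.eye(q)) + np.kron(np.eye(p), psi)

# Hypothetical small precision matrices for illustration
theta = np.array([[2.0, -0.5],
                  [-0.5, 2.0]])   # sample-wise inverse covariance
psi = np.array([[1.5, 0.3],
                [0.3, 1.5]])      # feature-wise inverse covariance

omega = kronecker_sum(theta, psi)  # 4x4 joint precision matrix

# A useful property of the Kronecker sum: its eigenvalues are all
# pairwise sums of the eigenvalues of the two factors.
eigs = np.sort(np.linalg.eigvalsh(omega))
pairwise = np.sort([a + b
                    for a in np.linalg.eigvalsh(theta)
                    for b in np.linalg.eigvalsh(psi)])
assert np.allclose(eigs, pairwise)
```

This eigenvalue property is what makes eigendecomposition-based methods (as the name EiGLasso suggests) attractive for this model: the spectrum of the large joint precision matrix is determined by the spectra of the two small factors.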

Cite this Paper


BibTeX
@InProceedings{pmlr-v124-ho-yoon20a,
  title     = {EiGLasso: Scalable Estimation of Cartesian Product of Sparse Inverse Covariance Matrices},
  author    = {Yoon, Jun Ho and Kim, Seyoung},
  booktitle = {Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)},
  pages     = {1248--1257},
  year      = {2020},
  editor    = {Peters, Jonas and Sontag, David},
  volume    = {124},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--06 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v124/ho-yoon20a/ho-yoon20a.pdf},
  url       = {https://proceedings.mlr.press/v124/ho-yoon20a.html},
  abstract  = {In this paper, we address the problem of jointly estimating dependencies across samples and dependencies across multiple features, where each set of dependencies is modeled as an inverse covariance matrix. In particular, we study a matrix-variate Gaussian distribution with the Kronecker-sum of sample-wise and feature-wise inverse covariances. While this Kronecker-sum model has been studied as an intuitively more appealing convex alternative to the Kronecker-product of two inverse covariance matrices, the existing methods do not scale to large datasets. We introduce a highly-efficient optimization method for estimating the Kronecker-sum structured inverse covariance matrix from matrix-variate data. In addition, we describe an alternative simpler approach for handling the non-identifiability of parameters than the strategies proposed in previous works. Using simulated and real data, we demonstrate our approach leads to one or two orders-of-magnitude speedup of the previous methods.}
}
Endnote
%0 Conference Paper
%T EiGLasso: Scalable Estimation of Cartesian Product of Sparse Inverse Covariance Matrices
%A Jun Ho Yoon
%A Seyoung Kim
%B Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)
%C Proceedings of Machine Learning Research
%D 2020
%E Jonas Peters
%E David Sontag
%F pmlr-v124-ho-yoon20a
%I PMLR
%P 1248--1257
%U https://proceedings.mlr.press/v124/ho-yoon20a.html
%V 124
%X In this paper, we address the problem of jointly estimating dependencies across samples and dependencies across multiple features, where each set of dependencies is modeled as an inverse covariance matrix. In particular, we study a matrix-variate Gaussian distribution with the Kronecker-sum of sample-wise and feature-wise inverse covariances. While this Kronecker-sum model has been studied as an intuitively more appealing convex alternative to the Kronecker-product of two inverse covariance matrices, the existing methods do not scale to large datasets. We introduce a highly-efficient optimization method for estimating the Kronecker-sum structured inverse covariance matrix from matrix-variate data. In addition, we describe an alternative simpler approach for handling the non-identifiability of parameters than the strategies proposed in previous works. Using simulated and real data, we demonstrate our approach leads to one or two orders-of-magnitude speedup of the previous methods.
APA
Yoon, J.H. & Kim, S. (2020). EiGLasso: Scalable Estimation of Cartesian Product of Sparse Inverse Covariance Matrices. Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), in Proceedings of Machine Learning Research 124:1248-1257. Available from https://proceedings.mlr.press/v124/ho-yoon20a.html.