Efficient Algorithm for Sparse Tensor-variate Gaussian Graphical Models via Gradient Descent

Pan Xu, Tingting Zhang, Quanquan Gu
Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, PMLR 54:923-932, 2017.

Abstract

We study the sparse tensor-variate Gaussian graphical model (STGGM), in which each way of the tensor follows a multivariate normal distribution whose precision matrix has a sparse structure. To estimate the precision matrices, we propose a sparsity-constrained maximum likelihood estimator. However, due to the complex structure of tensor-variate GGMs, this likelihood-based estimator is non-convex, which poses great challenges for both computation and theoretical analysis. To address these challenges, we propose an efficient alternating gradient descent algorithm for solving the estimator, and prove that, under certain conditions on the initial estimator, the algorithm is guaranteed to converge linearly to the unknown precision matrices up to the optimal statistical error. Experiments on both synthetic data and real-world brain imaging data corroborate our theory.
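The abstract summarizes the method without stating the updates. Below is a minimal illustrative sketch, in NumPy, of alternating gradient descent with hard thresholding for the special case K = 2 (a matrix-variate model with row precision Omega and column precision Psi). The function names (hard_threshold, stggm_altgrad), the step size, the number of iterations, and the identity initialization are all assumptions for illustration; the paper's actual algorithm handles general K-way tensors, and its linear-convergence guarantee depends on a suitable initial estimator rather than the identity used here.

    import numpy as np

    def hard_threshold(M, s):
        # Keep the s largest-magnitude off-diagonal entries of M (plus the
        # diagonal) and zero the rest -- a simple projection onto the
        # sparsity constraint. (Hypothetical helper, not from the paper.)
        off = M - np.diag(np.diag(M))
        flat = np.abs(off).ravel()
        if 0 < s < flat.size:
            cutoff = np.partition(flat, -s)[-s]   # s-th largest magnitude
            off[np.abs(off) < cutoff] = 0.0
        elif s == 0:
            off[:] = 0.0
        return off + np.diag(np.diag(M))

    def stggm_altgrad(X, s1, s2, eta=0.01, iters=200):
        # X: (n, p, q) array of n centered matrix-valued observations with
        #    vec(X_i) ~ N(0, Psi^{-1} kron Omega^{-1}).
        # s1, s2: sparsity levels for the row/column precision matrices.
        n, p, q = X.shape
        Omega, Psi = np.eye(p), np.eye(q)  # stand-in for the paper's initializer
        for _ in range(iters):
            # Partial gradient of the (scaled) negative log-likelihood in
            # Omega with Psi fixed: (1/(n*q)) sum_i X_i Psi X_i^T - Omega^{-1}.
            S1 = sum(Xi @ Psi @ Xi.T for Xi in X) / (n * q)
            Omega = hard_threshold(Omega - eta * (S1 - np.linalg.inv(Omega)), s1)
            Omega = (Omega + Omega.T) / 2.0  # symmetrize against numerical drift
            # Partial gradient in Psi with the updated Omega held fixed.
            S2 = sum(Xi.T @ Omega @ Xi for Xi in X) / (n * p)
            Psi = hard_threshold(Psi - eta * (S2 - np.linalg.inv(Psi)), s2)
            Psi = (Psi + Psi.T) / 2.0
        return Omega, Psi

Note that the Kronecker factorization is only identifiable up to a rescaling of the factors; the sketch ignores this normalization, which the actual estimator would need to fix.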

Cite this Paper


BibTeX
@InProceedings{pmlr-v54-xu17b,
  title     = {{Efficient Algorithm for Sparse Tensor-variate Gaussian Graphical Models via Gradient Descent}},
  author    = {Xu, Pan and Zhang, Tingting and Gu, Quanquan},
  booktitle = {Proceedings of the 20th International Conference on Artificial Intelligence and Statistics},
  pages     = {923--932},
  year      = {2017},
  editor    = {Singh, Aarti and Zhu, Jerry},
  volume    = {54},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--22 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v54/xu17b/xu17b.pdf},
  url       = {https://proceedings.mlr.press/v54/xu17b.html}
}
APA
Xu, P., Zhang, T. & Gu, Q. (2017). Efficient Algorithm for Sparse Tensor-variate Gaussian Graphical Models via Gradient Descent. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 54:923-932. Available from https://proceedings.mlr.press/v54/xu17b.html.
