Efficient Algorithm for Sparse Tensor-variate Gaussian Graphical Models via Gradient Descent
Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, PMLR 54:923-932, 2017.
Abstract
We study the sparse tensor-variate Gaussian graphical model (STGGM), where each way of the tensor follows a multivariate normal distribution whose precision matrix has a sparse structure. In order to estimate the precision matrices, we propose a sparsity-constrained maximum likelihood estimator. However, due to the complex structure of tensor-variate GGMs, the likelihood-based estimator is nonconvex, which poses great challenges for both computation and theoretical analysis. In order to address these challenges, we propose an efficient alternating gradient descent algorithm to solve this estimator, and prove that, under certain conditions on the initial estimator, our algorithm is guaranteed to converge linearly to the unknown precision matrices up to the optimal statistical error. Experiments on both synthetic data and real-world brain imaging data corroborate our theory.
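To make the abstract's approach concrete, the following is a minimal, illustrative sketch of alternating gradient descent with hard thresholding for the two-way (matrix-variate) special case. It is not the authors' implementation: the step size, iteration count, identity initialization, and the `hard_threshold` helper are all assumptions for illustration, and the model is only identifiable up to a scaling between the two precision matrices.

```python
import numpy as np

def neg_log_lik(X, Om1, Om2):
    # Average negative log-likelihood (up to additive constants) for
    # matrix-variate normal data X of shape (n, m, p) with row precision
    # Om1 (m x m) and column precision Om2 (p x p).
    n, m, p = X.shape
    S = np.mean([x @ Om2 @ x.T for x in X], axis=0)  # m x m
    _, ld1 = np.linalg.slogdet(Om1)
    _, ld2 = np.linalg.slogdet(Om2)
    return 0.5 * np.trace(Om1 @ S) - 0.5 * p * ld1 - 0.5 * m * ld2

def hard_threshold(Om, s):
    # Sparsity projection (an assumed form): keep the s largest
    # off-diagonal entries in magnitude plus the full diagonal.
    off = Om - np.diag(np.diag(Om))
    thresh = np.sort(np.abs(off), axis=None)[-s] if s > 0 else np.inf
    off[np.abs(off) < thresh] = 0.0
    return off + np.diag(np.diag(Om))

def alt_grad_descent(X, s1, s2, eta=0.05, iters=50):
    # Alternate a gradient step on each precision matrix while the
    # other is held fixed, followed by hard thresholding.
    n, m, p = X.shape
    Om1, Om2 = np.eye(m), np.eye(p)  # identity initialization (assumed)
    for _ in range(iters):
        S1 = np.mean([x @ Om2 @ x.T for x in X], axis=0)
        g1 = 0.5 * S1 - 0.5 * p * np.linalg.inv(Om1)
        Om1 = hard_threshold(Om1 - eta * g1, s1)
        Om1 = 0.5 * (Om1 + Om1.T)  # re-symmetrize
        S2 = np.mean([x.T @ Om1 @ x for x in X], axis=0)
        g2 = 0.5 * S2 - 0.5 * m * np.linalg.inv(Om2)
        Om2 = hard_threshold(Om2 - eta * g2, s2)
        Om2 = 0.5 * (Om2 + Om2.T)
    return Om1, Om2

# Usage on synthetic matrix-variate data with correlated rows.
rng = np.random.default_rng(0)
Sig1 = np.full((3, 3), 0.5) + 0.5 * np.eye(3)  # row covariance
A = np.linalg.cholesky(Sig1)
X = np.einsum('ij,njp->nip', A, rng.standard_normal((200, 3, 2)))
Om1, Om2 = alt_grad_descent(X, s1=6, s2=2)
```

The general tensor-variate case replaces the two alternating updates with one gradient-plus-thresholding step per mode of the tensor, cycling through the modes.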