Minimax Estimation of Laplacian Constrained Precision Matrices

Jiaxi Ying, José Vinícius de Miranda Cardoso, Daniel Palomar
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:3736-3744, 2021.

Abstract

This paper considers the problem of high-dimensional sparse precision matrix estimation under Laplacian constraints. We prove that the Laplacian constraints bring favorable properties for estimation: the Gaussian maximum likelihood estimator exists and is unique almost surely on the basis of one observation, irrespective of the dimension. We establish the optimal rate of convergence under Frobenius norm by the derivation of the minimax lower and upper bounds. The minimax lower bound is obtained by applying Le Cam-Assouad’s method with a novel construction of a subparameter space of multivariate normal distributions. The minimax upper bound is established by designing an adaptive $\ell_1$-norm regularized maximum likelihood estimation method and quantifying the rate of convergence. We prove that the proposed estimator attains the optimal rate of convergence with an overwhelming probability. Numerical experiments demonstrate the effectiveness of the proposed estimator.
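The Laplacian constraints referenced in the abstract require the precision matrix to be a combinatorial graph Laplacian: non-positive off-diagonal entries, zero row sums, and positive semidefiniteness. The sketch below (illustrative only; the graph, weights, and helper function are hypothetical, not from the paper) builds such a matrix and checks those three properties.

```python
# Hedged sketch: construct a weighted graph Laplacian and verify the
# structural constraints a Laplacian-constrained precision matrix satisfies.
import numpy as np

def graph_laplacian(n, edges):
    """Combinatorial Laplacian L = D - W from edge weights {(i, j): w}."""
    W = np.zeros((n, n))
    for (i, j), w in edges.items():
        W[i, j] = W[j, i] = w
    return np.diag(W.sum(axis=1)) - W

# Toy 4-node graph with positive edge weights (hypothetical example).
L = graph_laplacian(4, {(0, 1): 1.0, (1, 2): 2.0, (2, 3): 0.5, (0, 3): 1.5})

# Constraint 1: off-diagonal entries are non-positive.
assert np.all(L[~np.eye(4, dtype=bool)] <= 0)
# Constraint 2: each row sums to zero (the constant vector is in the null space).
assert np.allclose(L @ np.ones(4), 0)
# Constraint 3: positive semidefinite (all eigenvalues >= 0 up to round-off).
assert np.all(np.linalg.eigvalsh(L) >= -1e-10)
```

Because the row sums are zero, a Laplacian is always singular; this is why the paper's result that the Gaussian MLE exists and is unique from a single observation under these constraints is notable.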

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-ying21a,
  title     = {Minimax Estimation of Laplacian Constrained Precision Matrices},
  author    = {Ying, Jiaxi and Vin{\'i}cius de Miranda Cardoso, Jos{\'e} and Palomar, Daniel},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {3736--3744},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/ying21a/ying21a.pdf},
  url       = {https://proceedings.mlr.press/v130/ying21a.html},
  abstract  = {This paper considers the problem of high-dimensional sparse precision matrix estimation under Laplacian constraints. We prove that the Laplacian constraints bring favorable properties for estimation: the Gaussian maximum likelihood estimator exists and is unique almost surely on the basis of one observation, irrespective of the dimension. We establish the optimal rate of convergence under Frobenius norm by the derivation of the minimax lower and upper bounds. The minimax lower bound is obtained by applying Le Cam-Assouad's method with a novel construction of a subparameter space of multivariate normal distributions. The minimax upper bound is established by designing an adaptive $\ell_1$-norm regularized maximum likelihood estimation method and quantifying the rate of convergence. We prove that the proposed estimator attains the optimal rate of convergence with an overwhelming probability. Numerical experiments demonstrate the effectiveness of the proposed estimator.}
}
Endnote
%0 Conference Paper
%T Minimax Estimation of Laplacian Constrained Precision Matrices
%A Jiaxi Ying
%A José Vinícius de Miranda Cardoso
%A Daniel Palomar
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-ying21a
%I PMLR
%P 3736--3744
%U https://proceedings.mlr.press/v130/ying21a.html
%V 130
%X This paper considers the problem of high-dimensional sparse precision matrix estimation under Laplacian constraints. We prove that the Laplacian constraints bring favorable properties for estimation: the Gaussian maximum likelihood estimator exists and is unique almost surely on the basis of one observation, irrespective of the dimension. We establish the optimal rate of convergence under Frobenius norm by the derivation of the minimax lower and upper bounds. The minimax lower bound is obtained by applying Le Cam-Assouad's method with a novel construction of a subparameter space of multivariate normal distributions. The minimax upper bound is established by designing an adaptive $\ell_1$-norm regularized maximum likelihood estimation method and quantifying the rate of convergence. We prove that the proposed estimator attains the optimal rate of convergence with an overwhelming probability. Numerical experiments demonstrate the effectiveness of the proposed estimator.
APA
Ying, J., Vinícius de Miranda Cardoso, J. & Palomar, D. (2021). Minimax Estimation of Laplacian Constrained Precision Matrices. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:3736-3744. Available from https://proceedings.mlr.press/v130/ying21a.html.