Dimension-free Structured Covariance Estimation

Nikita Puchkin, Maxim Rakhuba
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:4276-4306, 2024.

Abstract

Given a sample of i.i.d. high-dimensional centered random vectors, we consider the problem of estimating their covariance matrix $\Sigma$ under the additional assumption that $\Sigma$ can be represented as a sum of a few Kronecker products of smaller matrices. Under mild conditions, we derive the first non-asymptotic dimension-free high-probability bound on the Frobenius distance between $\Sigma$ and a widely used penalized permuted least squares estimate. Because of the hidden structure, the established rate of convergence is faster than in the standard covariance estimation problem.
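
The page does not spell out the estimator, so the following is a minimal sketch, assuming the common rearrangement (Van Loan and Pitsianis) view behind permuted least squares: rearranging a matrix of the form $\Sigma = \sum_{k} A_k \otimes B_k$ yields a matrix whose rank is at most the number of Kronecker terms, so one can rearrange the sample covariance, truncate its SVD (used here as a stand-in for the paper's penalty), and rearrange back. The factor sizes p and q, the truncation rank, and all function names below are illustrative assumptions, not the authors' implementation.

import numpy as np

def rearrange(S, p, q):
    # Van Loan-Pitsianis rearrangement: maps a (p*q) x (p*q) matrix S to a
    # p^2 x q^2 matrix R such that R(A kron B) = vec(A) vec(B)^T, so a sum of
    # k Kronecker products is mapped to a matrix of rank at most k.
    R = np.empty((p * p, q * q))
    for i in range(p):
        for j in range(p):
            block = S[i * q:(i + 1) * q, j * q:(j + 1) * q]
            R[i * p + j] = block.reshape(-1)
    return R

def rearrange_inv(R, p, q):
    # Inverse rearrangement: rebuilds the (p*q) x (p*q) matrix from R.
    S = np.empty((p * q, p * q))
    for i in range(p):
        for j in range(p):
            S[i * q:(i + 1) * q, j * q:(j + 1) * q] = R[i * p + j].reshape(q, q)
    return S

def structured_cov_estimate(X, p, q, rank):
    # Illustrative permuted least squares estimate with a hard rank truncation
    # in place of the penalty: rearrange the sample covariance, keep the top
    # `rank` singular components, rearrange back.
    n = X.shape[0]
    S_n = X.T @ X / n  # sample covariance of centered observations
    U, s, Vt = np.linalg.svd(rearrange(S_n, p, q), full_matrices=False)
    R_hat = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return rearrange_inv(R_hat, p, q)

# Toy check: Sigma is a sum of two Kronecker products (the assumed structure).
rng = np.random.default_rng(0)
p, q, n = 4, 3, 2000
def random_spd(d):
    M = rng.standard_normal((d, d))
    return M @ M.T / d + np.eye(d)
Sigma = np.kron(random_spd(p), random_spd(q)) + np.kron(random_spd(p), random_spd(q))
X = rng.multivariate_normal(np.zeros(p * q), Sigma, size=n)
Sigma_hat = structured_cov_estimate(X, p, q, rank=2)
print(np.linalg.norm(Sigma_hat - Sigma, "fro") / np.linalg.norm(Sigma, "fro"))

The final line prints the relative Frobenius error between the structured estimate and the true $\Sigma$, i.e. the quantity whose high-probability behaviour the paper's bound controls.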

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-puchkin24a,
  title     = {Dimension-free Structured Covariance Estimation},
  author    = {Puchkin, Nikita and Rakhuba, Maxim},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {4276--4306},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/puchkin24a/puchkin24a.pdf},
  url       = {https://proceedings.mlr.press/v247/puchkin24a.html}
}
APA
Puchkin, N. & Rakhuba, M. (2024). Dimension-free Structured Covariance Estimation. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:4276-4306. Available from https://proceedings.mlr.press/v247/puchkin24a.html.
