Lower Bounds for Private Estimation of Gaussian Covariance Matrices under All Reasonable Parameter Regimes

Victor S. Portella, Nicholas J. A. Harvey
Proceedings of Thirty Eighth Conference on Learning Theory, PMLR 291:4640-4667, 2025.

Abstract

One of the most basic problems in statistics is estimating the covariance matrix of a Gaussian distribution. Over the past decade, researchers have studied the efficiency of covariance estimation in the setting of differential privacy. The goal is to minimize the number of samples needed to achieve the desired accuracy and privacy guarantees. We prove lower bounds on the number of samples needed to privately estimate the covariance matrix of a Gaussian distribution. Our bounds match existing upper bounds in the widest known setting of parameters. Our analysis can be seen as a fingerprinting argument, one of the main techniques used to prove lower bounds in differential privacy. Most fingerprinting arguments rely on results analogous to the celebrated Stein’s identity from probability theory. We use a matrix extension of this identity known as the Stein-Haff identity.
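For context, the classical Stein's identity that the abstract refers to can be stated as follows (a standard formulation from probability theory, not quoted from the paper itself): for $X \sim \mathcal{N}(\mu, \sigma^2)$ and any differentiable $g$ with $\mathbb{E}\lvert g'(X)\rvert < \infty$,

```latex
\mathbb{E}\bigl[ g(X)\,(X - \mu) \bigr] \;=\; \sigma^2\, \mathbb{E}\bigl[ g'(X) \bigr].
```

The Stein-Haff identity used in the paper is a matrix analogue of this relation for covariance (Wishart-type) matrices, relating expectations involving a sample covariance matrix to expectations of matrix derivatives of the estimator.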

Cite this Paper


BibTeX
@InProceedings{pmlr-v291-portella25a,
  title     = {Lower Bounds for Private Estimation of Gaussian Covariance Matrices under All Reasonable Parameter Regimes},
  author    = {Portella, Victor S. and Harvey, Nicholas J. A.},
  booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
  pages     = {4640--4667},
  year      = {2025},
  editor    = {Haghtalab, Nika and Moitra, Ankur},
  volume    = {291},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--04 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v291/main/assets/portella25a/portella25a.pdf},
  url       = {https://proceedings.mlr.press/v291/portella25a.html},
  abstract  = {One of the most basic problems in statistics is estimating the covariance matrix of a Gaussian distribution. Over the past decade, researchers have studied the efficiency of covariance estimation in the setting of differential privacy. The goal is to minimize the number of samples needed to achieve the desired accuracy and privacy guarantees. We prove lower bounds on the number of samples needed to privately estimate the covariance matrix of a Gaussian distribution. Our bounds match existing upper bounds in the widest known setting of parameters. Our analysis can be seen as a fingerprinting argument, one of the main techniques used to prove lower bounds in differential privacy. Most fingerprinting arguments rely on results analogous to the celebrated Stein’s identity from probability theory. We use a matrix extension of this identity known as the Stein-Haff identity.}
}
Endnote
%0 Conference Paper
%T Lower Bounds for Private Estimation of Gaussian Covariance Matrices under All Reasonable Parameter Regimes
%A Victor S. Portella
%A Nicholas J. A. Harvey
%B Proceedings of Thirty Eighth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Nika Haghtalab
%E Ankur Moitra
%F pmlr-v291-portella25a
%I PMLR
%P 4640--4667
%U https://proceedings.mlr.press/v291/portella25a.html
%V 291
%X One of the most basic problems in statistics is estimating the covariance matrix of a Gaussian distribution. Over the past decade, researchers have studied the efficiency of covariance estimation in the setting of differential privacy. The goal is to minimize the number of samples needed to achieve the desired accuracy and privacy guarantees. We prove lower bounds on the number of samples needed to privately estimate the covariance matrix of a Gaussian distribution. Our bounds match existing upper bounds in the widest known setting of parameters. Our analysis can be seen as a fingerprinting argument, one of the main techniques used to prove lower bounds in differential privacy. Most fingerprinting arguments rely on results analogous to the celebrated Stein’s identity from probability theory. We use a matrix extension of this identity known as the Stein-Haff identity.
APA
Portella, V.S. & Harvey, N.J.A. (2025). Lower Bounds for Private Estimation of Gaussian Covariance Matrices under All Reasonable Parameter Regimes. Proceedings of Thirty Eighth Conference on Learning Theory, in Proceedings of Machine Learning Research 291:4640-4667. Available from https://proceedings.mlr.press/v291/portella25a.html.