Lower Bounds for Private Estimation of Gaussian Covariance Matrices under All Reasonable Parameter Regimes
Proceedings of Thirty Eighth Conference on Learning Theory, PMLR 291:4640-4667, 2025.
Abstract
One of the most basic problems in statistics is estimating the covariance matrix of a Gaussian distribution. Over the past decade, researchers have studied the sample efficiency of covariance estimation under differential privacy, where the goal is to minimize the number of samples needed to achieve the desired accuracy and privacy guarantees. We prove lower bounds on the number of samples needed to privately estimate the covariance matrix of a Gaussian distribution. Our bounds match existing upper bounds in the widest known setting of parameters. Our analysis can be seen as a fingerprinting argument, one of the main techniques used to prove lower bounds in differential privacy. Most fingerprinting arguments rely on results analogous to the celebrated Stein’s identity from probability theory. We use a matrix extension of this identity known as the Stein-Haff identity.
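For context, the scalar identity the abstract alludes to is the classical Stein's identity; the Stein-Haff identity extends this kind of relation to the matrix (Wishart) setting. A minimal statement of the scalar case, for reference only (the paper's actual argument uses the matrix version):

```latex
% Stein's identity (scalar Gaussian case): if X ~ N(mu, sigma^2) and
% g is differentiable with E|g'(X)| < infinity, then
\[
  \mathbb{E}\bigl[\,g(X)\,(X - \mu)\,\bigr] \;=\; \sigma^{2}\,\mathbb{E}\bigl[\,g'(X)\,\bigr].
\]
% Fingerprinting arguments use such identities to relate the correlation
% between an estimator and individual samples to a derivative of the
% estimator, which is then bounded using the privacy guarantee.
```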