Agnostic Private Density Estimation for GMMs via List Global Stability
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:41-66, 2025.
Abstract
We consider the problem of private density estimation for mixtures of unbounded high-dimensional Gaussians in the agnostic setting. We prove the first upper bound on the sample complexity of this problem. Previously, private learnability of high-dimensional GMMs was known only in the realizable setting (Afzali et al., 2024). To prove our result, we exploit the notion of list global stability (Ghazi et al., 2021), which was originally introduced in the context of supervised learning. We define an agnostic variant of this notion and show that its existence suffices for agnostic private density estimation. We then construct an agnostic list globally stable learner for GMMs.