Agnostic Private Density Estimation for GMMs via List Global Stability

Mohammad Afzali, Hassan Ashtiani, Christopher Liaw
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:41-66, 2025.

Abstract

We consider the problem of private density estimation for mixtures of unbounded high-dimensional Gaussians in the agnostic setting. We prove the first upper bound on the sample complexity of this problem. Previously, private learnability of high-dimensional GMMs was known only in the realizable setting (Afzali et al., 2024). To prove our result, we exploit the notion of list global stability (Ghazi et al., 2021), which was originally introduced in the context of supervised learning. We define an agnostic variant of this notion and show that its existence is sufficient for agnostic private density estimation. We then construct an agnostic list globally stable learner for GMMs.
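To make the setting concrete, here is a minimal sketch of the agnostic guarantee in standard notation that is not taken from the paper itself (the symbols g, \hat{f}, \mathcal{G}_{k,d}, C, and \alpha are illustrative; the paper's exact approximation constant and sample complexity bound are stated there, not here). An $(\varepsilon,\delta)$-differentially private learner receives i.i.d. samples from an arbitrary distribution $g$ over $\mathbb{R}^d$ and outputs a mixture $\hat{f}$ of $k$ Gaussians satisfying, with high probability,

\[
  d_{\mathrm{TV}}\big(\hat{f}, g\big) \;\le\; C \cdot \inf_{f \in \mathcal{G}_{k,d}} d_{\mathrm{TV}}(f, g) \;+\; \alpha,
\]

where $\mathcal{G}_{k,d}$ is the class of mixtures of $k$ unbounded $d$-dimensional Gaussians; the realizable setting of Afzali et al. (2024) is the special case in which the infimum is zero. Roughly, and again only as a sketch of the supervised-learning notion from Ghazi et al. (2021), a learner $A$ that outputs a list of at most $L$ candidate hypotheses is list globally stable if some fixed hypothesis $h^*$ appears in the list with noticeable probability over the draw of the sample,

\[
  \exists\, h^* \ \text{such that}\ \Pr_{S \sim g^n}\big[h^* \in A(S)\big] \;\ge\; \eta,
\]

and the paper's agnostic variant additionally requires $h^*$ to compete with the best mixture in $\mathcal{G}_{k,d}$.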

Cite this Paper


BibTeX
@InProceedings{pmlr-v272-afzali25a,
  title     = {Agnostic Private Density Estimation for {GMM}s via List Global Stability},
  author    = {Afzali, Mohammad and Ashtiani, Hassan and Liaw, Christopher},
  booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
  pages     = {41--66},
  year      = {2025},
  editor    = {Kamath, Gautam and Loh, Po-Ling},
  volume    = {272},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v272/main/assets/afzali25a/afzali25a.pdf},
  url       = {https://proceedings.mlr.press/v272/afzali25a.html},
  abstract  = {We consider the problem of private density estimation for mixtures of unbounded high-dimensional Gaussians in the agnostic setting. We prove the first upper bound on the sample complexity of this problem. Previously, private learnability of high dimensional GMMs was only known in the realizable setting Afzali et al. (2024). To prove our result, we exploit the notion of \textit{list global stability} Ghazi et al. (2021) that was originally introduced in the context of supervised learning. We define an agnostic variant of this definition, showing that its existence is sufficient for agnostic private density estimation. We then construct an agnostic list globally stable learner for GMMs.}
}
Endnote
%0 Conference Paper
%T Agnostic Private Density Estimation for GMMs via List Global Stability
%A Mohammad Afzali
%A Hassan Ashtiani
%A Christopher Liaw
%B Proceedings of The 36th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Gautam Kamath
%E Po-Ling Loh
%F pmlr-v272-afzali25a
%I PMLR
%P 41--66
%U https://proceedings.mlr.press/v272/afzali25a.html
%V 272
%X We consider the problem of private density estimation for mixtures of unbounded high-dimensional Gaussians in the agnostic setting. We prove the first upper bound on the sample complexity of this problem. Previously, private learnability of high dimensional GMMs was only known in the realizable setting Afzali et al. (2024). To prove our result, we exploit the notion of \textit{list global stability} Ghazi et al. (2021) that was originally introduced in the context of supervised learning. We define an agnostic variant of this definition, showing that its existence is sufficient for agnostic private density estimation. We then construct an agnostic list globally stable learner for GMMs.
APA
Afzali, M., Ashtiani, H. & Liaw, C. (2025). Agnostic Private Density Estimation for GMMs via List Global Stability. Proceedings of The 36th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 272:41-66. Available from https://proceedings.mlr.press/v272/afzali25a.html.