Differentially Private Assouad, Fano, and Le Cam

Jayadev Acharya, Ziteng Sun, Huanyu Zhang
Proceedings of the 32nd International Conference on Algorithmic Learning Theory, PMLR 132:48-78, 2021.

Abstract

Le Cam’s method, Fano’s inequality, and Assouad’s lemma are three widely used techniques for proving lower bounds for statistical estimation tasks. We propose their analogues under central differential privacy. Our results are simple and easy to apply, and we use them to establish sample complexity bounds for several estimation tasks. We establish the optimal sample complexity of discrete distribution estimation under total variation distance and $\ell_2$ distance. We also provide lower bounds for several other distribution classes, including product distributions and Gaussian mixtures, that are tight up to logarithmic factors. The technical component of our paper relates coupling between distributions to the sample complexity of estimation under differential privacy.

Cite this Paper


BibTeX
@InProceedings{pmlr-v132-acharya21a,
  title     = {Differentially Private {A}ssouad, {F}ano, and {L}e {C}am},
  author    = {Acharya, Jayadev and Sun, Ziteng and Zhang, Huanyu},
  booktitle = {Proceedings of the 32nd International Conference on Algorithmic Learning Theory},
  pages     = {48--78},
  year      = {2021},
  editor    = {Feldman, Vitaly and Ligett, Katrina and Sabato, Sivan},
  volume    = {132},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--19 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v132/acharya21a/acharya21a.pdf},
  url       = {https://proceedings.mlr.press/v132/acharya21a.html},
  abstract  = {Le Cam's method, Fano's inequality, and Assouad's lemma are three widely used techniques for proving lower bounds for statistical estimation tasks. We propose their analogues under central differential privacy. Our results are simple and easy to apply, and we use them to establish sample complexity bounds for several estimation tasks. We establish the optimal sample complexity of discrete distribution estimation under total variation distance and $\ell_2$ distance. We also provide lower bounds for several other distribution classes, including product distributions and Gaussian mixtures, that are tight up to logarithmic factors. The technical component of our paper relates coupling between distributions to the sample complexity of estimation under differential privacy.}
}
Endnote
%0 Conference Paper
%T Differentially Private Assouad, Fano, and Le Cam
%A Jayadev Acharya
%A Ziteng Sun
%A Huanyu Zhang
%B Proceedings of the 32nd International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2021
%E Vitaly Feldman
%E Katrina Ligett
%E Sivan Sabato
%F pmlr-v132-acharya21a
%I PMLR
%P 48--78
%U https://proceedings.mlr.press/v132/acharya21a.html
%V 132
%X Le Cam’s method, Fano’s inequality, and Assouad’s lemma are three widely used techniques for proving lower bounds for statistical estimation tasks. We propose their analogues under central differential privacy. Our results are simple and easy to apply, and we use them to establish sample complexity bounds for several estimation tasks. We establish the optimal sample complexity of discrete distribution estimation under total variation distance and $\ell_2$ distance. We also provide lower bounds for several other distribution classes, including product distributions and Gaussian mixtures, that are tight up to logarithmic factors. The technical component of our paper relates coupling between distributions to the sample complexity of estimation under differential privacy.
APA
Acharya, J., Sun, Z. &amp; Zhang, H. (2021). Differentially Private Assouad, Fano, and Le Cam. Proceedings of the 32nd International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 132:48-78. Available from https://proceedings.mlr.press/v132/acharya21a.html.