Robust Estimation of Discrete Distributions under Local Differential Privacy

Julien Chhor, Flore Sentenac
Proceedings of The 34th International Conference on Algorithmic Learning Theory, PMLR 201:411-446, 2023.

Abstract

Although robust learning and local differential privacy are both widely studied fields of research, combining the two settings is just starting to be explored. We consider the problem of estimating a discrete distribution in total variation from $n$ contaminated data batches under a local differential privacy constraint. A fraction $1-\alpha$ of the batches contain $k$ i.i.d. samples drawn from a discrete distribution $p$ over $d$ elements. To protect the users’ privacy, each of the samples is privatized using an $\epsilon$-locally differentially private mechanism. The remaining $\alpha n$ batches are an adversarial contamination. The minimax rate of estimation under contamination alone, with no privacy, is known to be $\alpha/\sqrt{k}+\sqrt{d/kn}$. Under the privacy constraint alone, the minimax rate of estimation is $\sqrt{d^2/\epsilon^2 kn}$. We show, up to a $\sqrt{\log(1/\alpha)}$ factor, that combining the two constraints leads to a minimax estimation rate of $\alpha\sqrt{d/\epsilon^2 k}+\sqrt{d^2/\epsilon^2 kn}$, larger than the sum of the two separate rates. We provide a polynomial-time algorithm achieving this bound, as well as a matching information-theoretic lower bound.
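The three rates in the abstract can be compared numerically. The sketch below (an illustration only: the rates hold up to constants and the $\sqrt{\log(1/\alpha)}$ factor, and the parameter values are hypothetical) evaluates each expression and checks that, for a small privacy budget $\epsilon \le \sqrt{d}$, the combined rate exceeds the sum of the two separate rates:

```python
import math

def contamination_only_rate(alpha, d, k, n):
    # alpha/sqrt(k) + sqrt(d/(k*n)): minimax rate under contamination alone
    return alpha / math.sqrt(k) + math.sqrt(d / (k * n))

def privacy_only_rate(d, eps, k, n):
    # sqrt(d^2/(eps^2*k*n)): minimax rate under the eps-LDP constraint alone
    return math.sqrt(d**2 / (eps**2 * k * n))

def combined_rate(alpha, d, eps, k, n):
    # alpha*sqrt(d/(eps^2*k)) + sqrt(d^2/(eps^2*k*n)): rate under both constraints
    return alpha * math.sqrt(d / (eps**2 * k)) + privacy_only_rate(d, eps, k, n)

# Hypothetical parameter values, chosen so that eps <= sqrt(d)
alpha, d, eps, k, n = 0.05, 100, 0.5, 100, 1000

separate_sum = contamination_only_rate(alpha, d, k, n) + privacy_only_rate(d, eps, k, n)
both = combined_rate(alpha, d, eps, k, n)
print(both > separate_sum)  # the combined rate exceeds the sum of the separate rates
```

Intuitively, the gap comes from the first term: $\alpha\sqrt{d/\epsilon^2 k}$ is a factor $\sqrt{d}/\epsilon$ larger than the non-private contamination term $\alpha/\sqrt{k}$.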

Cite this Paper


BibTeX
@InProceedings{pmlr-v201-chhor23a,
  title     = {Robust Estimation of Discrete Distributions under Local Differential Privacy},
  author    = {Chhor, Julien and Sentenac, Flore},
  booktitle = {Proceedings of The 34th International Conference on Algorithmic Learning Theory},
  pages     = {411--446},
  year      = {2023},
  editor    = {Agrawal, Shipra and Orabona, Francesco},
  volume    = {201},
  series    = {Proceedings of Machine Learning Research},
  month     = {20 Feb--23 Feb},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v201/chhor23a/chhor23a.pdf},
  url       = {https://proceedings.mlr.press/v201/chhor23a.html},
  abstract  = {Although robust learning and local differential privacy are both widely studied fields of research, combining the two settings is just starting to be explored. We consider the problem of estimating a discrete distribution in total variation from $n$ contaminated data batches under a local differential privacy constraint. A fraction $1-\alpha$ of the batches contain $k$ i.i.d. samples drawn from a discrete distribution $p$ over $d$ elements. To protect the users’ privacy, each of the samples is privatized using an $\epsilon$-locally differentially private mechanism. The remaining $\alpha n$ batches are an adversarial contamination. The minimax rate of estimation under contamination alone, with no privacy, is known to be $\alpha/\sqrt{k}+\sqrt{d/kn}$. Under the privacy constraint alone, the minimax rate of estimation is $\sqrt{d^2/\epsilon^2 kn}$. We show, up to a $\sqrt{\log(1/\alpha)}$ factor, that combining the two constraints leads to a minimax estimation rate of $\alpha\sqrt{d/\epsilon^2 k}+\sqrt{d^2/\epsilon^2 kn}$, larger than the sum of the two separate rates. We provide a polynomial-time algorithm achieving this bound, as well as a matching information theoretic lower bound.}
}
Endnote
%0 Conference Paper
%T Robust Estimation of Discrete Distributions under Local Differential Privacy
%A Julien Chhor
%A Flore Sentenac
%B Proceedings of The 34th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2023
%E Shipra Agrawal
%E Francesco Orabona
%F pmlr-v201-chhor23a
%I PMLR
%P 411--446
%U https://proceedings.mlr.press/v201/chhor23a.html
%V 201
%X Although robust learning and local differential privacy are both widely studied fields of research, combining the two settings is just starting to be explored. We consider the problem of estimating a discrete distribution in total variation from $n$ contaminated data batches under a local differential privacy constraint. A fraction $1-\alpha$ of the batches contain $k$ i.i.d. samples drawn from a discrete distribution $p$ over $d$ elements. To protect the users’ privacy, each of the samples is privatized using an $\epsilon$-locally differentially private mechanism. The remaining $\alpha n$ batches are an adversarial contamination. The minimax rate of estimation under contamination alone, with no privacy, is known to be $\alpha/\sqrt{k}+\sqrt{d/kn}$. Under the privacy constraint alone, the minimax rate of estimation is $\sqrt{d^2/\epsilon^2 kn}$. We show, up to a $\sqrt{\log(1/\alpha)}$ factor, that combining the two constraints leads to a minimax estimation rate of $\alpha\sqrt{d/\epsilon^2 k}+\sqrt{d^2/\epsilon^2 kn}$, larger than the sum of the two separate rates. We provide a polynomial-time algorithm achieving this bound, as well as a matching information theoretic lower bound.
APA
Chhor, J. &amp; Sentenac, F. (2023). Robust Estimation of Discrete Distributions under Local Differential Privacy. Proceedings of The 34th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 201:411-446. Available from https://proceedings.mlr.press/v201/chhor23a.html.