Lower Bounds for Locally Private Estimation via Communication Complexity

John Duchi, Ryan Rogers
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:1161-1191, 2019.

Abstract

We develop lower bounds for estimation under local privacy constraints—including differential privacy and its relaxations to approximate or Rényi differential privacy—by showing an equivalence between private estimation and communication-restricted estimation problems. Our results apply to arbitrarily interactive privacy mechanisms, and they also give sharp lower bounds for all levels of differential privacy protections, that is, privacy mechanisms with privacy levels $\varepsilon \in [0, \infty)$. As a particular consequence of our results, we show that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{ \min\{\varepsilon, \varepsilon^2\}}$.
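The stated rate can be seen already in one dimension: with $d = 1$ and small $\varepsilon$, the abstract's bound reads $\frac{1}{n\varepsilon^2}$. The sketch below is a hypothetical illustration, not a mechanism from the paper: each user perturbs a bounded scalar with Laplace$(1/\varepsilon)$ noise (a standard $\varepsilon$-locally-private release), and the empirical mean-squared error of the aggregated estimate matches the $\frac{1}{n}\cdot\frac{1}{\varepsilon^2}$ scaling up to constants.

```python
import random

def laplace(scale, rng):
    # Laplace(scale) sampled as the difference of two independent exponentials
    return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)

def private_mean(xs, eps, rng):
    # eps-locally-private release: each user adds Laplace(1/eps) noise to
    # x_i in [0, 1] before sending it to the aggregator (d = 1 case)
    return sum(x + laplace(1.0 / eps, rng) for x in xs) / len(xs)

rng = random.Random(0)
n, eps, trials = 10_000, 0.5, 200

sq_errs = []
for _ in range(trials):
    xs = [rng.random() for _ in range(n)]           # Uniform(0, 1); true mean 1/2
    sq_errs.append((private_mean(xs, eps, rng) - 0.5) ** 2)

empirical_mse = sum(sq_errs) / trials
# Sampling variance plus Laplace noise variance; for small eps this is
# dominated by the 2/(n * eps^2) term, i.e. the (d/n)(d/eps^2) rate at d = 1.
predicted = (1.0 / 12 + 2.0 / eps**2) / n
print(f"empirical {empirical_mse:.2e}, predicted {predicted:.2e}")
```

The simple Laplace mechanism is rate-optimal only for $d = 1$; in higher dimensions the paper's lower bound shows an extra factor of $d$ is unavoidable, which naive per-coordinate noising does not achieve.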

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-duchi19a,
  title     = {Lower Bounds for Locally Private Estimation via Communication Complexity},
  author    = {Duchi, John and Rogers, Ryan},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {1161--1191},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/duchi19a/duchi19a.pdf},
  url       = {https://proceedings.mlr.press/v99/duchi19a.html},
  abstract  = {We develop lower bounds for estimation under local privacy constraints—including differential privacy and its relaxations to approximate or Rényi differential privacy—by showing an equivalence between private estimation and communication-restricted estimation problems. Our results apply to arbitrarily interactive privacy mechanisms, and they also give sharp lower bounds for all levels of differential privacy protections, that is, privacy mechanisms with privacy levels $\varepsilon \in [0, \infty)$. As a particular consequence of our results, we show that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{\min\{\varepsilon, \varepsilon^2\}}$.}
}
Endnote
%0 Conference Paper
%T Lower Bounds for Locally Private Estimation via Communication Complexity
%A John Duchi
%A Ryan Rogers
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-duchi19a
%I PMLR
%P 1161--1191
%U https://proceedings.mlr.press/v99/duchi19a.html
%V 99
%X We develop lower bounds for estimation under local privacy constraints—including differential privacy and its relaxations to approximate or Rényi differential privacy—by showing an equivalence between private estimation and communication-restricted estimation problems. Our results apply to arbitrarily interactive privacy mechanisms, and they also give sharp lower bounds for all levels of differential privacy protections, that is, privacy mechanisms with privacy levels $\varepsilon \in [0, \infty)$. As a particular consequence of our results, we show that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{\min\{\varepsilon, \varepsilon^2\}}$.
APA
Duchi, J. & Rogers, R. (2019). Lower Bounds for Locally Private Estimation via Communication Complexity. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:1161-1191. Available from https://proceedings.mlr.press/v99/duchi19a.html.
