Lower Bounds for Locally Private Estimation via Communication Complexity


John Duchi, Ryan Rogers;
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:1161-1191, 2019.


We develop lower bounds for estimation under local privacy constraints—including differential privacy and its relaxations to approximate or Rényi differential privacy—by showing an equivalence between private estimation and communication-restricted estimation problems. Our results apply to arbitrarily interactive privacy mechanisms, and they give sharp lower bounds for all levels of differential privacy protection, that is, for privacy parameters $\varepsilon \in [0, \infty)$. As a particular consequence of our results, we show that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{ \min\{\varepsilon, \varepsilon^2\}}$.
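As an illustration of the $\varepsilon^2$ regime of this rate, the following sketch (an assumption for exposition, not the authors' mechanism) simulates a simple non-interactive $\varepsilon$-locally-private estimator: each user reports a single uniformly chosen coordinate of their vector in $[-1,1]^d$ with Laplace noise of scale $2/\varepsilon$. Its squared $\ell_2$ error grows on the order of $d^2/(n\varepsilon^2)$, matching the $\frac{d}{n}\cdot\frac{d}{\varepsilon^2}$ scaling for $\varepsilon \le 1$.

```python
import numpy as np

rng = np.random.default_rng(0)

def private_mean(data, eps):
    """Non-interactive eps-LDP mean estimate (illustrative sketch).

    Each user releases one uniformly random coordinate of their vector in
    [-1, 1]^d, perturbed with Laplace noise of scale 2/eps; a single
    coordinate in [-1, 1] has sensitivity 2, so each report is eps-DP.
    """
    n, d = data.shape
    coords = rng.integers(d, size=n)  # coordinate each user reports
    reports = data[np.arange(n), coords] + rng.laplace(scale=2 / eps, size=n)
    est = np.zeros(d)
    for j in range(d):
        mask = coords == j
        if mask.any():
            est[j] = reports[mask].mean()
    return est

n, eps = 50_000, 1.0
errors = {}
for d in (2, 8):
    mu = rng.uniform(-0.5, 0.5, size=d)  # true mean (hypothetical data)
    data = np.clip(mu + rng.normal(scale=0.1, size=(n, d)), -1, 1)
    # Average squared L2 error over repeated private releases.
    errors[d] = np.mean(
        [np.sum((private_mean(data, eps) - mu) ** 2) for _ in range(20)]
    )
    print(f"d={d}: squared L2 error ~ {errors[d]:.2e}")
```

Quadrupling $d$ here multiplies the error by roughly $16$, consistent with the $d^2$ dependence; the paper's lower bound shows no locally private protocol, interactive or not, can do better in order.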
