Universally Instance-Optimal Mechanisms for Private Statistical Estimation
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:221-259, 2024.
Abstract
We consider the problem of instance-optimal statistical estimation under the constraint of differential privacy, where mechanisms must adapt to the difficulty of the input dataset. We prove a new instance-specific lower bound based on a novel divergence and show that it characterizes the local minimax optimal rates for private statistical estimation. We propose two new mechanisms that are universally instance-optimal for general estimation problems up to logarithmic factors. Our first mechanism, the total variation mechanism, builds on the exponential mechanism with stable approximations of the total variation distance, and is universally instance-optimal in the high-privacy regime $\epsilon \leq 1/\sqrt{n}$. Our second mechanism, the T-mechanism, is based on the T-estimator framework (Birgé, 2006) and uses the clipped log-likelihood ratio as a stable test: it attains instance-optimal rates for any $\epsilon \leq 1$, up to logarithmic factors. Finally, we study the implications of our results for robust statistical estimation, and show that our algorithms are universally optimal for this problem as well, characterizing the optimal minimax rates for robust statistical estimation.