Universally Instance-Optimal Mechanisms for Private Statistical Estimation

Hilal Asi, John C. Duchi, Saminul Haque, Zewei Li, Feng Ruan
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:221-259, 2024.

Abstract

We consider the problem of instance-optimal statistical estimation under the constraint of differential privacy, where mechanisms must adapt to the difficulty of the input dataset. We prove a new instance-specific lower bound using a new divergence and show that it characterizes the local minimax optimal rates for private statistical estimation. We propose two new mechanisms that are universally instance-optimal for general estimation problems up to logarithmic factors. Our first mechanism, the total variation mechanism, builds on the exponential mechanism with stable approximations of the total variation distance, and is universally instance-optimal in the high privacy regime $\epsilon \leq 1/\sqrt{n}$. Our second mechanism, the T-mechanism, is based on the T-estimator framework (Birgé, 2006) using the clipped log-likelihood ratio as a stable test: it attains instance-optimal rates for any $\epsilon \leq 1$ up to logarithmic factors. Finally, we study the implications of our results for robust statistical estimation, and show that our algorithms are universally optimal for this problem, characterizing the optimal minimax rates for robust statistical estimation.
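To give a rough sense of the total variation mechanism's starting point (and not the paper's actual construction, which uses stable approximations of the TV distance), the sketch below runs a plain exponential mechanism over a finite family of candidate distributions, scoring each candidate by the negative empirical TV distance to the data. The candidate family, function names, and the unstabilized score are illustrative assumptions only.

import numpy as np

def tv_distance(p, q):
    # Total variation distance between two discrete distributions on the same support.
    return 0.5 * float(np.abs(np.asarray(p) - np.asarray(q)).sum())

def tv_exponential_mechanism(data, candidates, epsilon, rng=None):
    # Illustrative exponential mechanism over `candidates`, scored by the negative
    # empirical TV distance. Swapping one record moves the empirical distribution
    # by at most 1/n in TV, so the score sensitivity is 1/n and the exponent is
    # epsilon * n * score / 2 (standard exponential-mechanism weighting).
    rng = np.random.default_rng() if rng is None else rng
    data = np.asarray(data)
    n = data.shape[0]
    k = candidates.shape[1]
    empirical = np.bincount(data, minlength=k) / n
    scores = np.array([-tv_distance(empirical, c) for c in candidates])
    logits = epsilon * n * scores / 2.0
    logits -= logits.max()          # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return int(rng.choice(len(candidates), p=probs))

# Toy usage: pick among three candidate distributions over a binary domain.
candidates = np.array([[0.9, 0.1], [0.5, 0.5], [0.1, 0.9]])
data = np.random.default_rng(0).integers(0, 2, size=1000)
idx = tv_exponential_mechanism(data, candidates, epsilon=1.0)
print("selected candidate:", candidates[idx])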

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-asi24a,
  title     = {Universally Instance-Optimal Mechanisms for Private Statistical Estimation},
  author    = {Asi, Hilal and Duchi, John C. and Haque, Saminul and Li, Zewei and Ruan, Feng},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {221--259},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/asi24a/asi24a.pdf},
  url       = {https://proceedings.mlr.press/v247/asi24a.html},
  abstract  = {We consider the problem of instance-optimal statistical estimation under the constraint of differential privacy where mechanisms must adapt to the difficulty of the input dataset. We prove a new instance-specific lower bound using a new divergence and show it characterizes the local minimax optimal rates for private statistical estimation. We propose two new mechanisms that are universally instance-optimal for general estimation problems up to logarithmic factors. Our first mechanism, the total variation mechanism, builds on the exponential mechanism with stable approximations of the total variation distance, and is universally instance-optimal in the high privacy regime $\epsilon \leq 1/\sqrt{n}$. Our second mechanism, the T-mechanism, is based on the T-estimator framework (Birg{é}, 2006) using the clipped log likelihood ratio as a stable test: it attains instance-optimal rates for any $\epsilon \leq 1$ up to logarithmic factors. Finally, we study the implications of our results for robust statistical estimation, and show that our algorithms are universally optimal for this problem, characterizing the optimal minimax rates for robust statistical estimation.}
}
Endnote
%0 Conference Paper
%T Universally Instance-Optimal Mechanisms for Private Statistical Estimation
%A Hilal Asi
%A John C. Duchi
%A Saminul Haque
%A Zewei Li
%A Feng Ruan
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-asi24a
%I PMLR
%P 221--259
%U https://proceedings.mlr.press/v247/asi24a.html
%V 247
%X We consider the problem of instance-optimal statistical estimation under the constraint of differential privacy where mechanisms must adapt to the difficulty of the input dataset. We prove a new instance-specific lower bound using a new divergence and show it characterizes the local minimax optimal rates for private statistical estimation. We propose two new mechanisms that are universally instance-optimal for general estimation problems up to logarithmic factors. Our first mechanism, the total variation mechanism, builds on the exponential mechanism with stable approximations of the total variation distance, and is universally instance-optimal in the high privacy regime $\epsilon \leq 1/\sqrt{n}$. Our second mechanism, the T-mechanism, is based on the T-estimator framework (Birgé, 2006) using the clipped log likelihood ratio as a stable test: it attains instance-optimal rates for any $\epsilon \leq 1$ up to logarithmic factors. Finally, we study the implications of our results for robust statistical estimation, and show that our algorithms are universally optimal for this problem, characterizing the optimal minimax rates for robust statistical estimation.
APA
Asi, H., Duchi, J.C., Haque, S., Li, Z. & Ruan, F. (2024). Universally Instance-Optimal Mechanisms for Private Statistical Estimation. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:221-259. Available from https://proceedings.mlr.press/v247/asi24a.html.