Some Constructions of Private, Efficient, and Optimal $K$-Norm and Elliptic Gaussian Noise

Matthew Joseph, Alexander Yu
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:2723-2766, 2024.

Abstract

Differentially private computation often begins with a bound on some $d$-dimensional statistic’s $\ell_p$ sensitivity. For pure differential privacy, the $K$-norm mechanism can improve on this approach using a norm tailored to the statistic’s sensitivity space. Writing down a closed-form description of this optimal norm is often straightforward. However, running the $K$-norm mechanism reduces to uniformly sampling the norm’s unit ball; this ball is a $d$-dimensional convex body, so general sampling algorithms can be slow. Turning to concentrated differential privacy, elliptic Gaussian noise offers similar improvement over spherical Gaussian noise. Once the shape of this ellipse is determined, sampling is easy; however, identifying the best such shape may be hard. This paper solves both problems for the simple statistics of sum, count, and vote. For each statistic, we provide a sampler for the optimal $K$-norm mechanism that runs in time $\tilde O(d^2)$ and derive a closed-form expression for the optimal shape of elliptic Gaussian noise. The resulting algorithms all yield meaningful accuracy improvements while remaining fast and simple enough to be practical. More broadly, we suggest that problem-specific sensitivity space analysis may be an overlooked tool for private additive noise.
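As a rough illustration of the two noise primitives the abstract contrasts, the sketch below (an assumption-laden stand-in, not the paper's optimal constructions) implements the standard Hardt-Talwar recipe for K-norm noise, a Gamma-distributed radius times a uniform draw from the norm's unit ball, using the $\ell_1$ ball in place of the statistic-specific balls the paper derives, and samples elliptic Gaussian noise through a Cholesky factor once a shape matrix is fixed. The shape matrix sigma here is an arbitrary placeholder, not the paper's closed-form optimum.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_l1_ball(d, rng):
        # Uniform draw from the d-dimensional l1 unit ball: normalized iid
        # exponentials give a uniform point on the simplex; random signs
        # spread it over the whole ball. Placeholder for the paper's
        # statistic-specific (sum/count/vote) optimal balls.
        e = rng.exponential(scale=1.0, size=d + 1)
        signs = rng.choice([-1.0, 1.0], size=d)
        return signs * e[:d] / e.sum()

    def k_norm_noise(d, eps, rng, sample_ball=sample_l1_ball):
        # K-norm mechanism noise, density proportional to exp(-eps * ||z||_K):
        # r ~ Gamma(d + 1, scale 1/eps) times a uniform draw from the unit ball.
        r = rng.gamma(shape=d + 1, scale=1.0 / eps)
        return r * sample_ball(d, rng)

    def elliptic_gaussian_noise(sigma, rng):
        # Elliptic Gaussian noise N(0, sigma): once the ellipse's shape is
        # fixed, sampling reduces to a Cholesky factor times spherical noise.
        chol = np.linalg.cholesky(sigma)
        return chol @ rng.standard_normal(sigma.shape[0])

    d, eps = 5, 1.0
    print(k_norm_noise(d, eps, rng))            # pure-DP additive noise
    sigma = np.diag(np.linspace(1.0, 2.0, d))   # hypothetical shape matrix
    print(elliptic_gaussian_noise(sigma, rng))  # concentrated-DP noise

The sketch makes the abstract's contrast concrete: the K-norm draw needs a sampler for the norm's unit ball (easy for the $\ell_1$ ball, potentially slow for a general convex body), while the elliptic Gaussian draw is trivial once the shape matrix is known, which is why the paper's contributions are, respectively, a fast ball sampler and a closed-form optimal shape.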

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-joseph24a,
  title     = {Some Constructions of Private, Efficient, and Optimal $K$-Norm and Elliptic Gaussian Noise},
  author    = {Joseph, Matthew and Yu, Alexander},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {2723--2766},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/joseph24a/joseph24a.pdf},
  url       = {https://proceedings.mlr.press/v247/joseph24a.html},
  abstract  = {Differentially private computation often begins with a bound on some $d$-dimensional statistic’s $\ell_p$ sensitivity. For pure differential privacy, the $K$-norm mechanism can improve on this approach using a norm tailored to the statistic’s sensitivity space. Writing down a closed-form description of this optimal norm is often straightforward. However, running the $K$-norm mechanism reduces to uniformly sampling the norm’s unit ball; this ball is a $d$-dimensional convex body, so general sampling algorithms can be slow. Turning to concentrated differential privacy, elliptic Gaussian noise offers similar improvement over spherical Gaussian noise. Once the shape of this ellipse is determined, sampling is easy; however, identifying the best such shape may be hard. This paper solves both problems for the simple statistics of sum, count, and vote. For each statistic, we provide a sampler for the optimal $K$-norm mechanism that runs in time $\tilde O(d^2)$ and derive a closed-form expression for the optimal shape of elliptic Gaussian noise. The resulting algorithms all yield meaningful accuracy improvements while remaining fast and simple enough to be practical. More broadly, we suggest that problem-specific sensitivity space analysis may be an overlooked tool for private additive noise.}
}
Endnote
%0 Conference Paper
%T Some Constructions of Private, Efficient, and Optimal $K$-Norm and Elliptic Gaussian Noise
%A Matthew Joseph
%A Alexander Yu
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-joseph24a
%I PMLR
%P 2723--2766
%U https://proceedings.mlr.press/v247/joseph24a.html
%V 247
%X Differentially private computation often begins with a bound on some $d$-dimensional statistic’s $\ell_p$ sensitivity. For pure differential privacy, the $K$-norm mechanism can improve on this approach using a norm tailored to the statistic’s sensitivity space. Writing down a closed-form description of this optimal norm is often straightforward. However, running the $K$-norm mechanism reduces to uniformly sampling the norm’s unit ball; this ball is a $d$-dimensional convex body, so general sampling algorithms can be slow. Turning to concentrated differential privacy, elliptic Gaussian noise offers similar improvement over spherical Gaussian noise. Once the shape of this ellipse is determined, sampling is easy; however, identifying the best such shape may be hard. This paper solves both problems for the simple statistics of sum, count, and vote. For each statistic, we provide a sampler for the optimal $K$-norm mechanism that runs in time $\tilde O(d^2)$ and derive a closed-form expression for the optimal shape of elliptic Gaussian noise. The resulting algorithms all yield meaningful accuracy improvements while remaining fast and simple enough to be practical. More broadly, we suggest that problem-specific sensitivity space analysis may be an overlooked tool for private additive noise.
APA
Joseph, M. & Yu, A. (2024). Some Constructions of Private, Efficient, and Optimal $K$-Norm and Elliptic Gaussian Noise. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:2723-2766. Available from https://proceedings.mlr.press/v247/joseph24a.html.