Some Constructions of Private, Efficient, and Optimal $K$-Norm and Elliptic Gaussian Noise
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:2723-2766, 2024.
Abstract
Differentially private computation often begins with a bound on some $d$-dimensional statistic’s $\ell_p$ sensitivity. For pure differential privacy, the $K$-norm mechanism can improve on this approach using a norm tailored to the statistic’s sensitivity space. Writing down a closed-form description of this optimal norm is often straightforward. However, running the $K$-norm mechanism reduces to uniformly sampling the norm’s unit ball; this ball is a $d$-dimensional convex body, so general sampling algorithms can be slow. Turning to concentrated differential privacy, elliptic Gaussian noise offers similar improvement over spherical Gaussian noise. Once the shape of this ellipse is determined, sampling is easy; however, identifying the best such shape may be hard. This paper solves both problems for the simple statistics of sum, count, and vote. For each statistic, we provide a sampler for the optimal $K$-norm mechanism that runs in time $\tilde O(d^2)$ and derive a closed-form expression for the optimal shape of elliptic Gaussian noise. The resulting algorithms all yield meaningful accuracy improvements while remaining fast and simple enough to be practical. More broadly, we suggest that problem-specific sensitivity space analysis may be an overlooked tool for private additive noise.
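For context, here is a minimal sketch of the generic $K$-norm recipe the abstract alludes to: sampling noise with density proportional to $\exp(-\varepsilon \|z\|_K)$ reduces to drawing a Gamma-distributed radius times a uniform point in the $K$ unit ball. The sketch uses the $\ell_1$ ball as a stand-in norm (via the Barthe–Guédon–Mendelson–Naor construction) and assumes the statistic has unit $K$-sensitivity; the function names are illustrative, and the paper's actual contribution is fast samplers for its tailored optimal norms, not this generic recipe.

```python
import numpy as np

def sample_l1_ball(d, rng):
    """Uniform sample from the d-dimensional l1 unit ball
    (Barthe-Guedon-Mendelson-Naor construction)."""
    g = rng.laplace(size=d)           # iid Laplace(1) coordinates
    y = rng.exponential()             # one extra Exponential(1) variable
    return g / (np.abs(g).sum() + y)  # normalizing gives a uniform ball point

def k_norm_mechanism(true_value, epsilon, rng, ball_sampler=sample_l1_ball):
    """Generic K-norm mechanism sketch: adds noise with density
    proportional to exp(-epsilon * ||z||_K), assuming the statistic
    has K-sensitivity at most 1. The noise factors into a radius
    r ~ Gamma(d + 1, 1/epsilon) times a uniform point in the K unit ball."""
    d = len(true_value)
    r = rng.gamma(shape=d + 1, scale=1.0 / epsilon)
    u = ball_sampler(d, rng)
    return np.asarray(true_value) + r * u

rng = np.random.default_rng(0)
noisy = k_norm_mechanism(np.zeros(5), epsilon=1.0, rng=rng)
```

The elliptic Gaussian counterpart needs no such machinery: once a covariance shape $\Sigma$ is fixed, a call like rng.multivariate_normal(mean, sigma) suffices, which is why the hard part there is choosing the shape rather than sampling from it.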