Optimizing Noise Distributions for Differential Privacy

Atefeh Gilani, Juan Felipe Gomez, Shahab Asoodeh, Flavio Calmon, Oliver Kosut, Lalitha Sankar
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:19505-19522, 2025.

Abstract

We propose a unified optimization framework for designing continuous and discrete noise distributions that ensure differential privacy (DP) by minimizing Rényi DP, a variant of DP, under a cost constraint. Rényi DP has the advantage that by considering different values of the Rényi parameter $\alpha$, we can tailor our optimization for any number of compositions. To solve the optimization problem, we reduce it to a finite-dimensional convex formulation and perform preconditioned gradient descent. The resulting noise distributions are then compared to their Gaussian and Laplace counterparts. Numerical results demonstrate that our optimized distributions are consistently better, with significant improvements in $(\varepsilon, \delta)$-DP guarantees in the moderate composition regimes, compared to Gaussian and Laplace distributions with the same variance.
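The Rényi-DP objective described in the abstract can be evaluated numerically for the Gaussian and Laplace baselines. The sketch below is not the paper's algorithm; the integer support, truncation, and sensitivity-1 query are illustrative assumptions. It computes the order-$\alpha$ Rényi divergence between a discretized noise pmf and its unit shift, which is the quantity an additive-noise mechanism's Rényi-DP guarantee reduces to, at matched variance for both baseline distributions.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(P || Q) = log(sum_k p_k^alpha * q_k^(1-alpha)) / (alpha - 1),
    computed in the log domain for numerical stability."""
    log_terms = alpha * np.log(p) + (1 - alpha) * np.log(q)
    m = log_terms.max()
    return (m + np.log(np.exp(log_terms - m).sum())) / (alpha - 1)

# Truncated integer support; tail mass is negligible at this noise scale.
k = np.arange(-50, 51)
sigma = 3.0

# Discretized Gaussian and Laplace noise pmfs with matched variance sigma^2.
p_gauss = np.exp(-k**2 / (2 * sigma**2))
p_gauss /= p_gauss.sum()
b = sigma / np.sqrt(2)           # Laplace scale: variance 2*b^2 = sigma^2
p_lap = np.exp(-np.abs(k) / b)
p_lap /= p_lap.sum()

# For an additive-noise mechanism on a sensitivity-1 integer query, the
# Renyi-DP level is the divergence between the noise pmf and its unit shift.
alpha = 8.0
eps_gauss = renyi_divergence(p_gauss, np.roll(p_gauss, 1), alpha)
eps_lap = renyi_divergence(p_lap, np.roll(p_lap, 1), alpha)
print(eps_gauss, eps_lap)  # Gaussian value is close to alpha / (2 * sigma^2)
```

Sweeping $\alpha$ in this calculation shows how the relative ranking of the two baselines changes with the Rényi order, which is the regime-dependence the paper's variance-constrained optimization is designed to exploit.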

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-gilani25a,
  title = {Optimizing Noise Distributions for Differential Privacy},
  author = {Gilani, Atefeh and Gomez, Juan Felipe and Asoodeh, Shahab and Calmon, Flavio and Kosut, Oliver and Sankar, Lalitha},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages = {19505--19522},
  year = {2025},
  editor = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume = {267},
  series = {Proceedings of Machine Learning Research},
  month = {13--19 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/gilani25a/gilani25a.pdf},
  url = {https://proceedings.mlr.press/v267/gilani25a.html},
  abstract = {We propose a unified optimization framework for designing continuous and discrete noise distributions that ensure differential privacy (DP) by minimizing Rényi DP, a variant of DP, under a cost constraint. Rényi DP has the advantage that by considering different values of the Rényi parameter $\alpha$, we can tailor our optimization for any number of compositions. To solve the optimization problem, we reduce it to a finite-dimensional convex formulation and perform preconditioned gradient descent. The resulting noise distributions are then compared to their Gaussian and Laplace counterparts. Numerical results demonstrate that our optimized distributions are consistently better, with significant improvements in $(\varepsilon, \delta)$-DP guarantees in the moderate composition regimes, compared to Gaussian and Laplace distributions with the same variance.}
}
Endnote
%0 Conference Paper
%T Optimizing Noise Distributions for Differential Privacy
%A Atefeh Gilani
%A Juan Felipe Gomez
%A Shahab Asoodeh
%A Flavio Calmon
%A Oliver Kosut
%A Lalitha Sankar
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-gilani25a
%I PMLR
%P 19505--19522
%U https://proceedings.mlr.press/v267/gilani25a.html
%V 267
%X We propose a unified optimization framework for designing continuous and discrete noise distributions that ensure differential privacy (DP) by minimizing Rényi DP, a variant of DP, under a cost constraint. Rényi DP has the advantage that by considering different values of the Rényi parameter $\alpha$, we can tailor our optimization for any number of compositions. To solve the optimization problem, we reduce it to a finite-dimensional convex formulation and perform preconditioned gradient descent. The resulting noise distributions are then compared to their Gaussian and Laplace counterparts. Numerical results demonstrate that our optimized distributions are consistently better, with significant improvements in $(\varepsilon, \delta)$-DP guarantees in the moderate composition regimes, compared to Gaussian and Laplace distributions with the same variance.
APA
Gilani, A., Gomez, J.F., Asoodeh, S., Calmon, F., Kosut, O. & Sankar, L. (2025). Optimizing Noise Distributions for Differential Privacy. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:19505-19522. Available from https://proceedings.mlr.press/v267/gilani25a.html.