Poisson Subsampled Rényi Differential Privacy

Yuqing Zhu, Yu-Xiang Wang
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7634-7642, 2019.

Abstract

We consider the problem of privacy amplification by subsampling under the Rényi Differential Privacy framework. This is the main technique underlying the moments accountant (Abadi et al., 2016) for differentially private deep learning. Unlike previous attempts on this problem, which deal with Sampling with Replacement, we consider the Poisson subsampling scheme, which selects each data point independently with a coin toss. This allows us to significantly simplify and tighten the bounds for the RDP of subsampled mechanisms and to derive numerically stable approximation schemes. In particular, for the subsampled Gaussian mechanism and the subsampled Laplace mechanism, we prove an analytical formula for their RDP that exactly matches the lower bound. The result is the first of its kind, and we numerically demonstrate an order of magnitude improvement in the privacy-utility tradeoff.
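To make the sampling scheme concrete, the short Python sketch below (all names, e.g. poisson_subsample, are illustrative and not taken from the paper's code) draws a Poisson subsample by tossing an independent coin for each record and then releases a noisy sum via the Gaussian mechanism; the paper's contribution is the tight RDP accounting for mechanisms of exactly this form.

import numpy as np

def poisson_subsample(data, gamma, rng):
    # Keep each record independently with probability gamma (one coin toss per record).
    mask = rng.random(len(data)) < gamma
    return data[mask]

def subsampled_gaussian_sum(data, gamma, sigma, rng):
    # Noisy sum over a Poisson subsample. Rows are assumed clipped to L2 norm <= 1,
    # so the sum has sensitivity 1 and Gaussian noise with std sigma is added per coordinate.
    batch = poisson_subsample(data, gamma, rng)
    return batch.sum(axis=0) + sigma * rng.standard_normal(data.shape[1])

rng = np.random.default_rng(0)
data = rng.standard_normal((1000, 5))
# Clip each row to L2 norm at most 1 so the per-record sensitivity of the sum is 1.
data = data / np.maximum(1.0, np.linalg.norm(data, axis=1, keepdims=True))
release = subsampled_gaussian_sum(data, gamma=0.01, sigma=1.0, rng=rng)
print(release)

Note that under Poisson subsampling the batch size is random (Binomial with parameters n and gamma), in contrast to schemes that draw a fixed-size batch.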

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-zhu19c,
  title = {Poisson Subsampled Rényi Differential Privacy},
  author = {Zhu, Yuqing and Wang, Yu-Xiang},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages = {7634--7642},
  year = {2019},
  editor = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume = {97},
  series = {Proceedings of Machine Learning Research},
  month = {09--15 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v97/zhu19c/zhu19c.pdf},
  url = {https://proceedings.mlr.press/v97/zhu19c.html},
  abstract = {We consider the problem of privacy amplification by subsampling under the Rényi Differential Privacy framework. This is the main technique underlying the moments accountant (Abadi et al., 2016) for differentially private deep learning. Unlike previous attempts on this problem, which deal with Sampling with Replacement, we consider the Poisson subsampling scheme, which selects each data point independently with a coin toss. This allows us to significantly simplify and tighten the bounds for the RDP of subsampled mechanisms and to derive numerically stable approximation schemes. In particular, for the subsampled Gaussian mechanism and the subsampled Laplace mechanism, we prove an analytical formula for their RDP that exactly matches the lower bound. The result is the first of its kind, and we numerically demonstrate an order of magnitude improvement in the privacy-utility tradeoff.}
}
Endnote
%0 Conference Paper
%T Poisson Subsampled Rényi Differential Privacy
%A Yuqing Zhu
%A Yu-Xiang Wang
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-zhu19c
%I PMLR
%P 7634--7642
%U https://proceedings.mlr.press/v97/zhu19c.html
%V 97
%X We consider the problem of privacy amplification by subsampling under the Rényi Differential Privacy framework. This is the main technique underlying the moments accountant (Abadi et al., 2016) for differentially private deep learning. Unlike previous attempts on this problem, which deal with Sampling with Replacement, we consider the Poisson subsampling scheme, which selects each data point independently with a coin toss. This allows us to significantly simplify and tighten the bounds for the RDP of subsampled mechanisms and to derive numerically stable approximation schemes. In particular, for the subsampled Gaussian mechanism and the subsampled Laplace mechanism, we prove an analytical formula for their RDP that exactly matches the lower bound. The result is the first of its kind, and we numerically demonstrate an order of magnitude improvement in the privacy-utility tradeoff.
APA
Zhu, Y. & Wang, Y. (2019). Poisson Subsampled Rényi Differential Privacy. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7634-7642. Available from https://proceedings.mlr.press/v97/zhu19c.html.
