Flexible Accuracy for Differential Privacy

Aman Bansal, Rahul Chunduru, Deepesh Data, Manoj Prabhakaran
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:3847-3882, 2022.

Abstract

Differential Privacy (DP) has become a gold standard in privacy-preserving data analysis. While it provides one of the most rigorous notions of privacy, there are many settings where its applicability is limited. Our main contribution is in augmenting differential privacy with Flexible Accuracy, which allows small distortions in the input (e.g., dropping outliers) before measuring accuracy of the output, allowing one to extend DP mechanisms to high-sensitivity functions. We present mechanisms that can help in achieving this notion for functions that had no meaningful differentially private mechanisms previously. In particular, we illustrate an application to differentially private histograms, which in turn yields mechanisms for revealing the support of a dataset or the extremal values in the data. Analyses of our constructions exploit new versatile composition theorems that facilitate modular design. All the above extensions use our new definitional framework, which is in terms of “lossy Wasserstein distance” – a 2-parameter error measure for distributions. This may be of independent interest.
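The "lossy Wasserstein distance" mentioned above is the paper's 2-parameter variant of the classical Wasserstein (earth mover's) metric. As background only, here is a minimal sketch of the standard 1-Wasserstein distance between two equal-size 1-D empirical samples, which reduces to the average gap between sorted values; the function name and the outlier example are illustrative, not from the paper:

```python
def wasserstein1(xs, ys):
    """Standard 1-D 1-Wasserstein distance between two equal-size
    empirical samples: the mean absolute gap between sorted values."""
    if len(xs) != len(ys):
        raise ValueError("samples must have equal size")
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

# A single outlier can dominate the distance, which is the kind of
# sensitivity that motivates allowing small input distortions
# (e.g., dropping outliers) before measuring accuracy.
close = wasserstein1([0, 1, 2], [1, 2, 3])      # each point shifts by 1
skewed = wasserstein1([0, 1, 2], [0, 1, 100])   # one outlier dominates
```

Intuitively, a "lossy" variant that may discard a small probability mass before comparing distributions keeps `skewed` small, which is what lets the framework extend DP mechanisms to high-sensitivity functions.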

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-bansal22a,
  title     = {Flexible Accuracy for Differential Privacy},
  author    = {Bansal, Aman and Chunduru, Rahul and Data, Deepesh and Prabhakaran, Manoj},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {3847--3882},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/bansal22a/bansal22a.pdf},
  url       = {https://proceedings.mlr.press/v151/bansal22a.html},
  abstract  = {Differential Privacy (DP) has become a gold standard in privacy-preserving data analysis. While it provides one of the most rigorous notions of privacy, there are many settings where its applicability is limited. Our main contribution is in augmenting differential privacy with Flexible Accuracy, which allows small distortions in the input (e.g., dropping outliers) before measuring accuracy of the output, allowing one to extend DP mechanisms to high-sensitivity functions. We present mechanisms that can help in achieving this notion for functions that had no meaningful differentially private mechanisms previously. In particular, we illustrate an application to differentially private histograms, which in turn yields mechanisms for revealing the support of a dataset or the extremal values in the data. Analyses of our constructions exploit new versatile composition theorems that facilitate modular design. All the above extensions use our new definitional framework, which is in terms of ``lossy Wasserstein distance'' -- a 2-parameter error measure for distributions. This may be of independent interest.}
}
Endnote
%0 Conference Paper
%T Flexible Accuracy for Differential Privacy
%A Aman Bansal
%A Rahul Chunduru
%A Deepesh Data
%A Manoj Prabhakaran
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-bansal22a
%I PMLR
%P 3847--3882
%U https://proceedings.mlr.press/v151/bansal22a.html
%V 151
%X Differential Privacy (DP) has become a gold standard in privacy-preserving data analysis. While it provides one of the most rigorous notions of privacy, there are many settings where its applicability is limited. Our main contribution is in augmenting differential privacy with Flexible Accuracy, which allows small distortions in the input (e.g., dropping outliers) before measuring accuracy of the output, allowing one to extend DP mechanisms to high-sensitivity functions. We present mechanisms that can help in achieving this notion for functions that had no meaningful differentially private mechanisms previously. In particular, we illustrate an application to differentially private histograms, which in turn yields mechanisms for revealing the support of a dataset or the extremal values in the data. Analyses of our constructions exploit new versatile composition theorems that facilitate modular design. All the above extensions use our new definitional framework, which is in terms of “lossy Wasserstein distance” – a 2-parameter error measure for distributions. This may be of independent interest.
APA
Bansal, A., Chunduru, R., Data, D. & Prabhakaran, M. (2022). Flexible Accuracy for Differential Privacy. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:3847-3882. Available from https://proceedings.mlr.press/v151/bansal22a.html.