Optimal Compression of Locally Differentially Private Mechanisms

Abhin Shah, Wei-Ning Chen, Johannes Ballé, Peter Kairouz, Lucas Theis
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:7680-7723, 2022.

Abstract

Compressing the output of $\epsilon$-locally differentially private (LDP) randomizers naively leads to suboptimal utility. In this work, we demonstrate the benefits of using schemes that jointly compress and privatize the data using shared randomness. In particular, we investigate a family of schemes based on Minimal Random Coding (Havasi et al., 2019) and prove that they offer optimal privacy-accuracy-communication tradeoffs. Our theoretical and empirical findings show that our approach can compress PrivUnit (Bhowmick et al., 2018) and Subset Selection (Ye et al., 2018), the best known LDP algorithms for mean and frequency estimation, to the order of $\epsilon$ bits of communication while preserving their privacy and accuracy guarantees.
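
To make the scheme concrete, here is a minimal sketch of the Minimal Random Coding primitive (Havasi et al., 2019) applied to a toy $\epsilon$-LDP randomizer, $k$-ary randomized response. This is an illustration only, not the paper's PrivUnit or Subset Selection constructions; the alphabet size d, the candidate count K, and the uniform reference distribution q are assumptions made for the example.

import numpy as np

# Toy eps-LDP randomizer: k-ary randomized response over an alphabet of size d.
def randomized_response_pmf(x, d, eps):
    """Conditional pmf p(z | x): report x w.p. e^eps / (e^eps + d - 1)."""
    p = np.full(d, 1.0 / (np.exp(eps) + d - 1))
    p[x] = np.exp(eps) / (np.exp(eps) + d - 1)
    return p

def mrc_encode(x, d, eps, K, shared_seed):
    """Sender: draw K candidates from the shared reference distribution q
    (uniform here), then pick one by importance sampling against p(. | x).
    Only the index is transmitted, costing about log2(K) bits."""
    rng = np.random.default_rng(shared_seed)   # shared randomness
    candidates = rng.integers(0, d, size=K)    # z_1, ..., z_K ~ q
    # Importance weights proportional to p(z_i | x) / q(z_i); q is uniform,
    # so the constant 1/d cancels after normalization.
    weights = randomized_response_pmf(x, d, eps)[candidates]
    weights = weights / weights.sum()
    sampler = np.random.default_rng()          # sender's private randomness
    return int(sampler.choice(K, p=weights))

def mrc_decode(index, d, K, shared_seed):
    """Receiver: regenerate the same candidates and look up the index."""
    rng = np.random.default_rng(shared_seed)
    candidates = rng.integers(0, d, size=K)
    return int(candidates[index])

# With K on the order of e^eps candidates, the transmitted index costs
# log2(K), i.e. roughly eps / ln(2) bits.
eps, d = 2.0, 16
K = int(np.ceil(np.exp(eps)))
z = mrc_decode(mrc_encode(x=3, d=d, eps=eps, K=K, shared_seed=0),
               d=d, K=K, shared_seed=0)

The output z only approximates a sample from p(. | x); the paper's analysis quantifies how large K must be for the privacy and accuracy guarantees to be preserved, which is what yields the order-of-$\epsilon$-bits communication cost.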

Cite this Paper

BibTeX
@InProceedings{pmlr-v151-shah22b,
  title     = {Optimal Compression of Locally Differentially Private Mechanisms},
  author    = {Shah, Abhin and Chen, Wei-Ning and Ball\'e, Johannes and Kairouz, Peter and Theis, Lucas},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {7680--7723},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/shah22b/shah22b.pdf},
  url       = {https://proceedings.mlr.press/v151/shah22b.html}
}
Endnote
%0 Conference Paper
%T Optimal Compression of Locally Differentially Private Mechanisms
%A Abhin Shah
%A Wei-Ning Chen
%A Johannes Ballé
%A Peter Kairouz
%A Lucas Theis
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-shah22b
%I PMLR
%P 7680--7723
%U https://proceedings.mlr.press/v151/shah22b.html
%V 151
APA
Shah, A., Chen, W., Ballé, J., Kairouz, P. & Theis, L. (2022). Optimal Compression of Locally Differentially Private Mechanisms. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:7680-7723. Available from https://proceedings.mlr.press/v151/shah22b.html.
