Lossless Compression of Efficient Private Local Randomizers

Vitaly Feldman, Kunal Talwar
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:3208-3219, 2021.

Abstract

Locally Differentially Private (LDP) Reports are commonly used for collection of statistics and machine learning in the federated setting. In many cases the best known LDP algorithms require sending prohibitively large messages from the client device to the server (such as when constructing histograms over a large domain or learning a high-dimensional model). Here we demonstrate a general approach that, under standard cryptographic assumptions, compresses every efficient LDP algorithm with negligible loss in privacy and utility guarantees. The practical implication of our result is that in typical applications every message can be compressed to the size of the server’s pseudo-random generator seed. From this general approach we derive low-communication algorithms for the problems of frequency estimation and high-dimensional mean estimation. Our algorithms are simpler and more accurate than existing low-communication LDP algorithms for these well-studied problems.
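The core idea described in the abstract — that the client can send a short PRG seed instead of the randomizer's full output — can be illustrated with a toy sketch. The snippet below is not the paper's construction; it is a minimal, hypothetical illustration for binary randomized response, where the client rejection-samples over candidate seeds until the seed's PRG expansion is a valid sample from the local randomizer, then transmits only the accepted seed. The hash-based `prg` and the XOR-derived acceptance coin are illustrative assumptions, not part of the paper.

```python
import hashlib
import math
import random

def prg(seed: int) -> float:
    # Expand a short seed into a pseudo-random value in [0, 1).
    # Toy PRG via hashing; a real system would use a cryptographic PRG.
    h = hashlib.sha256(seed.to_bytes(8, "big")).digest()
    return int.from_bytes(h[:8], "big") / 2**64

def randomized_response_prob(x: int, y: int, eps: float) -> float:
    # p(y | x) for binary randomized response with privacy parameter eps.
    p_keep = math.exp(eps) / (math.exp(eps) + 1)
    return p_keep if y == x else 1 - p_keep

def compress_report(x: int, eps: float, rng: random.Random) -> int:
    # Rejection-sample over seeds: each seed proposes a candidate output y
    # (uniform proposal q(y) = 1/2); accept with probability
    # p(y|x) / (M * q(y)) <= 1, where M bounds the density ratio.
    # The accepted seed *is* the compressed message sent to the server.
    M = 2 * math.exp(eps) / (math.exp(eps) + 1)
    while True:
        seed = rng.randrange(2**32)
        y = int(prg(seed) >= 0.5)          # candidate output from the seed
        coin = prg(seed ^ 0xDEADBEEF)      # independent acceptance coin (toy)
        if coin < randomized_response_prob(x, y, eps) / (M * 0.5):
            return seed

def decompress_report(seed: int) -> int:
    # Server re-expands the seed to recover the randomizer's output.
    return int(prg(seed) >= 0.5)
```

Because the rejection step accepts candidates in proportion to the randomizer's true likelihood, the server-side decompression is distributed exactly as randomized response on the client's bit, while the transmitted message is only a 32-bit seed. (The paper's actual result handles general efficient LDP randomizers under cryptographic assumptions, with careful treatment of the privacy of the acceptance process.)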

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-feldman21a,
  title     = {Lossless Compression of Efficient Private Local Randomizers},
  author    = {Feldman, Vitaly and Talwar, Kunal},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {3208--3219},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/feldman21a/feldman21a.pdf},
  url       = {https://proceedings.mlr.press/v139/feldman21a.html},
  abstract  = {Locally Differentially Private (LDP) Reports are commonly used for collection of statistics and machine learning in the federated setting. In many cases the best known LDP algorithms require sending prohibitively large messages from the client device to the server (such as when constructing histograms over a large domain or learning a high-dimensional model). Here we demonstrate a general approach that, under standard cryptographic assumptions, compresses every efficient LDP algorithm with negligible loss in privacy and utility guarantees. The practical implication of our result is that in typical applications every message can be compressed to the size of the server’s pseudo-random generator seed. From this general approach we derive low-communication algorithms for the problems of frequency estimation and high-dimensional mean estimation. Our algorithms are simpler and more accurate than existing low-communication LDP algorithms for these well-studied problems.}
}
Endnote
%0 Conference Paper
%T Lossless Compression of Efficient Private Local Randomizers
%A Vitaly Feldman
%A Kunal Talwar
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-feldman21a
%I PMLR
%P 3208--3219
%U https://proceedings.mlr.press/v139/feldman21a.html
%V 139
%X Locally Differentially Private (LDP) Reports are commonly used for collection of statistics and machine learning in the federated setting. In many cases the best known LDP algorithms require sending prohibitively large messages from the client device to the server (such as when constructing histograms over a large domain or learning a high-dimensional model). Here we demonstrate a general approach that, under standard cryptographic assumptions, compresses every efficient LDP algorithm with negligible loss in privacy and utility guarantees. The practical implication of our result is that in typical applications every message can be compressed to the size of the server’s pseudo-random generator seed. From this general approach we derive low-communication algorithms for the problems of frequency estimation and high-dimensional mean estimation. Our algorithms are simpler and more accurate than existing low-communication LDP algorithms for these well-studied problems.
APA
Feldman, V. & Talwar, K. (2021). Lossless Compression of Efficient Private Local Randomizers. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:3208-3219. Available from https://proceedings.mlr.press/v139/feldman21a.html.