Locally Private k-Means in One Round

Alisa Chang, Badih Ghazi, Ravi Kumar, Pasin Manurangsi
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:1441-1451, 2021.

Abstract

We provide an approximation algorithm for k-means clustering in the one-round (aka non-interactive) local model of differential privacy (DP). Our algorithm achieves an approximation ratio arbitrarily close to that of the best non-private approximation algorithm, improving upon previously known algorithms that only guarantee large (constant) approximation ratios. Furthermore, ours is the first constant-factor approximation algorithm for k-means that requires only one round of communication in the local DP model, positively resolving an open question of Stemmer (SODA 2020). Our algorithmic framework is quite flexible; we demonstrate this by showing that it also yields a similar near-optimal approximation algorithm in the (one-round) shuffle DP model.
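For intuition, here is a minimal, hypothetical sketch of the one-round (non-interactive) local DP communication pattern the abstract refers to: each user sends a single privatized message, and the server alone performs all subsequent clustering. This naive Laplace-noise pipeline is not the paper's algorithm and does not attain its approximation guarantees; it only illustrates the model. The [0,1]^d domain, the Laplace mechanism, and all function names are assumptions made for illustration.

```python
# Illustrative sketch ONLY: a naive one-round local-DP pipeline for k-means.
# This is NOT the paper's algorithm; it just shows the communication model,
# where each user sends one privatized message and never interacts again.
import numpy as np

def local_randomizer(point, epsilon, rng):
    """One-round local report: the user adds Laplace noise and sends once."""
    d = point.shape[0]
    # Assumed domain [0, 1]^d: the l1 sensitivity of a point is d, so the
    # Laplace mechanism uses scale d / epsilon per coordinate for epsilon-DP.
    return point + rng.laplace(scale=d / epsilon, size=d)

def server_kmeans(messages, k, iters=20, seed=0):
    """Server-side (non-private) Lloyd's iterations on the noisy reports."""
    rng = np.random.default_rng(seed)
    centers = messages[rng.choice(len(messages), size=k, replace=False)]
    for _ in range(iters):
        # Assign each noisy report to its nearest current center.
        dists = ((messages[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        # Recompute each center as the mean of its assigned reports.
        for j in range(k):
            if (labels == j).any():
                centers[j] = messages[labels == j].mean(axis=0)
    return centers

rng = np.random.default_rng(1)
data = rng.random((2000, 2))  # users' private points in [0, 1]^2
reports = np.stack([local_randomizer(x, epsilon=1.0, rng=rng) for x in data])
print(server_kmeans(reports, k=3))
```

Note the contrast with interactive local DP protocols, where the server may query users over multiple rounds; here the entire privacy burden sits in the single call to the local randomizer.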

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-chang21a,
  title     = {Locally Private k-Means in One Round},
  author    = {Chang, Alisa and Ghazi, Badih and Kumar, Ravi and Manurangsi, Pasin},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {1441--1451},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/chang21a/chang21a.pdf},
  url       = {https://proceedings.mlr.press/v139/chang21a.html}
}
Endnote
%0 Conference Paper
%T Locally Private k-Means in One Round
%A Alisa Chang
%A Badih Ghazi
%A Ravi Kumar
%A Pasin Manurangsi
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-chang21a
%I PMLR
%P 1441--1451
%U https://proceedings.mlr.press/v139/chang21a.html
%V 139
APA
Chang, A., Ghazi, B., Kumar, R., & Manurangsi, P. (2021). Locally Private k-Means in One Round. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:1441-1451. Available from https://proceedings.mlr.press/v139/chang21a.html.