Differentially Private Set Union

Sivakanth Gopi, Pankaj Gulhane, Janardhan Kulkarni, Judy Hanwen Shen, Milad Shokouhi, Sergey Yekhanin
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:3627-3636, 2020.

Abstract

We study the basic operation of set union in the global model of differential privacy. In this problem, we are given a universe $U$ of items, possibly of infinite size, and a database $D$ of users. Each user $i$ contributes a subset $W_i \subseteq U$ of items. We want an ($\epsilon$,$\delta$)-differentially private algorithm which outputs a subset $S \subset \cup_i W_i$ such that the size of $S$ is as large as possible. The problem arises in countless real-world applications, and is particularly ubiquitous in natural language processing (NLP) applications. For example, discovering words, sentences, $n$-grams, etc., from private text data belonging to users is an instance of the set union problem. In this paper we design new algorithms for this problem that significantly outperform the best known algorithms.
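To make the problem concrete, the following is a minimal sketch of the classical count-and-threshold baseline that algorithms for this problem build on (the function name, parameters, and threshold constant here are illustrative assumptions, not the paper's method): each user adds a bounded number of items to a histogram, Laplace noise is added to each count, and only items whose noisy count clears a threshold are released.

```python
import math
import random
from collections import Counter

def dp_set_union(user_sets, epsilon, delta, max_contrib=1):
    """Release a subset of the union of user_sets.

    Count-and-threshold baseline sketch: because each user changes at
    most `max_contrib` histogram counts by 1, adding Laplace noise and
    thresholding yields an (epsilon, delta)-DP release (standard
    stability-based histogram argument; constants are illustrative).
    """
    counts = Counter()
    for items in user_sets:
        # Bound each user's contribution so the histogram has bounded
        # sensitivity under adding or removing one user.
        for item in sorted(items)[:max_contrib]:
            counts[item] += 1

    scale = max_contrib / epsilon  # Laplace noise scale
    # Threshold chosen so items supported by a single user survive
    # only with small probability (on the order of delta).
    threshold = 1 + max_contrib * math.log(1.0 / (2 * delta)) / epsilon

    released = set()
    for item, count in counts.items():
        # Laplace(scale) sampled as the difference of two exponentials.
        noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
        if count + noise > threshold:
            released.add(item)
    return released
```

Only items in the true union can ever be released, so privacy risk comes solely from the release decision on low-count items; the paper's contribution is to allocate each user's contribution budget more carefully than this uniform truncation, so that more items clear the threshold.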

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-gopi20a,
  title     = {Differentially Private Set Union},
  author    = {Gopi, Sivakanth and Gulhane, Pankaj and Kulkarni, Janardhan and Shen, Judy Hanwen and Shokouhi, Milad and Yekhanin, Sergey},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {3627--3636},
  year      = {2020},
  editor    = {Hal Daumé III and Aarti Singh},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/gopi20a/gopi20a.pdf},
  url       = {http://proceedings.mlr.press/v119/gopi20a.html},
  abstract  = {We study the basic operation of set union in the global model of differential privacy. In this problem, we are given a universe $U$ of items, possibly of infinite size, and a database $D$ of users. Each user $i$ contributes a subset $W_i \subseteq U$ of items. We want an ($\epsilon$,$\delta$)-differentially private algorithm which outputs a subset $S \subset \cup_i W_i$ such that the size of $S$ is as large as possible. The problem arises in countless real-world applications, and is particularly ubiquitous in natural language processing (NLP) applications. For example, discovering words, sentences, $n$-grams, etc., from private text data belonging to users is an instance of the set union problem. In this paper we design new algorithms for this problem that significantly outperform the best known algorithms.}
}
Endnote
%0 Conference Paper
%T Differentially Private Set Union
%A Sivakanth Gopi
%A Pankaj Gulhane
%A Janardhan Kulkarni
%A Judy Hanwen Shen
%A Milad Shokouhi
%A Sergey Yekhanin
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-gopi20a
%I PMLR
%P 3627--3636
%U http://proceedings.mlr.press/v119/gopi20a.html
%V 119
%X We study the basic operation of set union in the global model of differential privacy. In this problem, we are given a universe $U$ of items, possibly of infinite size, and a database $D$ of users. Each user $i$ contributes a subset $W_i \subseteq U$ of items. We want an ($\epsilon$,$\delta$)-differentially private algorithm which outputs a subset $S \subset \cup_i W_i$ such that the size of $S$ is as large as possible. The problem arises in countless real-world applications, and is particularly ubiquitous in natural language processing (NLP) applications. For example, discovering words, sentences, $n$-grams, etc., from private text data belonging to users is an instance of the set union problem. In this paper we design new algorithms for this problem that significantly outperform the best known algorithms.
APA
Gopi, S., Gulhane, P., Kulkarni, J., Shen, J.H., Shokouhi, M. & Yekhanin, S. (2020). Differentially Private Set Union. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:3627-3636. Available from http://proceedings.mlr.press/v119/gopi20a.html.