Private Center Points and Learning of Halfspaces

Amos Beimel, Shay Moran, Kobbi Nissim, Uri Stemmer
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:269-282, 2019.

Abstract

We present a private agnostic learner for halfspaces over an arbitrary finite domain $X\subset \R^d$ with sample complexity $\mathsf{poly}(d,2^{\log^*|X|})$. The building block for this learner is a differentially private algorithm for locating an approximate center point of $m>\mathsf{poly}(d,2^{\log^*|X|})$ points – a high dimensional generalization of the median function. Our construction establishes a relationship between these two problems that is reminiscent of the relation between the median and learning one-dimensional thresholds [Bun et al. FOCS ’15]. This relationship suggests that the problem of privately locating a center point may have further applications in the design of differentially private algorithms. We also provide a lower bound on the sample complexity for privately finding a point in the convex hull. For approximate differential privacy, we show a lower bound of $m=\Omega(d+\log^*|X|)$, whereas for pure differential privacy $m=\Omega(d\log|X|)$.
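The abstract's analogy to the one-dimensional case can be made concrete with the classical exponential-mechanism median (the standard tool behind private threshold learning; this is an illustrative sketch, not the paper's center-point algorithm, and `private_median` is a name chosen here for illustration):

```python
import math
import random

def private_median(data, domain, eps, rng=random):
    """Pick an approximate median of `data` from a finite ordered `domain`
    via the exponential mechanism.

    Score of a candidate t is minus the rank error |#{x < t} - #{x > t}|;
    swapping one data point changes this score by at most 1 (sensitivity 1),
    so sampling t with weight exp(eps * score / 2) is eps-differentially
    private.
    """
    scores = []
    for t in domain:
        below = sum(1 for x in data if x < t)
        above = sum(1 for x in data if x > t)
        scores.append(-abs(below - above))
    # Subtract the max score before exponentiating, for numerical stability.
    m = max(scores)
    weights = [math.exp(eps * (s - m) / 2) for s in scores]
    # Sample a candidate proportionally to its weight.
    r = rng.random() * sum(weights)
    acc = 0.0
    for t, w in zip(domain, weights):
        acc += w
        if acc >= r:
            return t
    return domain[-1]
```

With a large privacy budget the output concentrates on the true median; as `eps` shrinks, the output spreads over nearby ranks. A center point plays the analogous role in $\R^d$, which is what makes the high-dimensional problem harder.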

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-beimel19a,
  title     = {Private Center Points and Learning of Halfspaces},
  author    = {Beimel, Amos and Moran, Shay and Nissim, Kobbi and Stemmer, Uri},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {269--282},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/beimel19a/beimel19a.pdf},
  url       = {https://proceedings.mlr.press/v99/beimel19a.html},
  abstract  = {We present a private agnostic learner for halfspaces over an arbitrary finite domain $X\subset \R^d$ with sample complexity $\mathsf{poly}(d,2^{\log^*|X|})$. The building block for this learner is a differentially private algorithm for locating an approximate center point of $m>\mathsf{poly}(d,2^{\log^*|X|})$ points – a high dimensional generalization of the median function. Our construction establishes a relationship between these two problems that is reminiscent of the relation between the median and learning one-dimensional thresholds [Bun et al. FOCS ’15]. This relationship suggests that the problem of privately locating a center point may have further applications in the design of differentially private algorithms. We also provide a lower bound on the sample complexity for privately finding a point in the convex hull. For approximate differential privacy, we show a lower bound of $m=\Omega(d+\log^*|X|)$, whereas for pure differential privacy $m=\Omega(d\log|X|)$.}
}
Endnote
%0 Conference Paper
%T Private Center Points and Learning of Halfspaces
%A Amos Beimel
%A Shay Moran
%A Kobbi Nissim
%A Uri Stemmer
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-beimel19a
%I PMLR
%P 269--282
%U https://proceedings.mlr.press/v99/beimel19a.html
%V 99
%X We present a private agnostic learner for halfspaces over an arbitrary finite domain $X\subset \R^d$ with sample complexity $\mathsf{poly}(d,2^{\log^*|X|})$. The building block for this learner is a differentially private algorithm for locating an approximate center point of $m>\mathsf{poly}(d,2^{\log^*|X|})$ points – a high dimensional generalization of the median function. Our construction establishes a relationship between these two problems that is reminiscent of the relation between the median and learning one-dimensional thresholds [Bun et al. FOCS ’15]. This relationship suggests that the problem of privately locating a center point may have further applications in the design of differentially private algorithms. We also provide a lower bound on the sample complexity for privately finding a point in the convex hull. For approximate differential privacy, we show a lower bound of $m=\Omega(d+\log^*|X|)$, whereas for pure differential privacy $m=\Omega(d\log|X|)$.
APA
Beimel, A., Moran, S., Nissim, K. & Stemmer, U. (2019). Private Center Points and Learning of Halfspaces. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:269-282. Available from https://proceedings.mlr.press/v99/beimel19a.html.