Differentially Private Nonparametric Regression Under a Growth Condition

Noah Golowich
Proceedings of Thirty Fourth Conference on Learning Theory, PMLR 134:2149-2192, 2021.

Abstract

Given a real-valued hypothesis class H, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from H given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of H is necessary for private learnability. Here online learnability of H is characterized by the finiteness of its eta-sequential fat-shattering dimension, sfat_eta(H), for all eta > 0. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that H is privately learnable if lim_{eta -> 0} sfat_eta(H) is finite, which is a fairly restrictive condition. We show that under the relaxed condition liminf_{eta -> 0} eta * sfat_eta(H) = 0, H is privately learnable, thus establishing the first nonparametric private learnability guarantee for classes H with sfat_eta(H) diverging as eta -> 0. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
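To make the growth condition concrete, here is a short worked example; the logarithmic rate assumed below is hypothetical, chosen purely for illustration, and is not a class analyzed in the paper. Suppose sfat_eta(H) <= C log(1/eta) for some constant C > 0. Then

% Hypothetical rate for illustration: sfat_eta(H) <= C log(1/eta) is an assumption, not a result from the paper.
\[
\liminf_{\eta \to 0} \, \eta \cdot \mathrm{sfat}_\eta(H) \;\le\; \lim_{\eta \to 0} C \, \eta \log\frac{1}{\eta} \;=\; 0,
\]

so such an H satisfies the relaxed condition and is privately learnable by the paper's result, even though lim_{eta -> 0} sfat_eta(H) = infinity, so the earlier sufficient condition of Jung et al. (2020) does not apply.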

Cite this Paper


BibTeX
@InProceedings{pmlr-v134-golowich21a, title = {Differentially Private Nonparametric Regression Under a Growth Condition}, author = {Golowich, Noah}, booktitle = {Proceedings of Thirty Fourth Conference on Learning Theory}, pages = {2149--2192}, year = {2021}, editor = {Belkin, Mikhail and Kpotufe, Samory}, volume = {134}, series = {Proceedings of Machine Learning Research}, month = {15--19 Aug}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v134/golowich21a/golowich21a.pdf}, url = {https://proceedings.mlr.press/v134/golowich21a.html}, abstract = {Given a real-valued hypothesis class H, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from H given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of H is necessary for private learnability. Here online learnability of H is characterized by the finiteness of its eta-sequential fat-shattering dimension, sfat_eta(H), for all eta > 0. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that H is privately learnable if lim_{eta -> 0} sfat_eta(H) is finite, which is a fairly restrictive condition. We show that under the relaxed condition liminf_{eta -> 0} eta * sfat_eta(H) = 0, H is privately learnable, thus establishing the first nonparametric private learnability guarantee for classes H with sfat_eta(H) diverging as eta -> 0. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.} }
Endnote
%0 Conference Paper %T Differentially Private Nonparametric Regression Under a Growth Condition %A Noah Golowich %B Proceedings of Thirty Fourth Conference on Learning Theory %C Proceedings of Machine Learning Research %D 2021 %E Mikhail Belkin %E Samory Kpotufe %F pmlr-v134-golowich21a %I PMLR %P 2149--2192 %U https://proceedings.mlr.press/v134/golowich21a.html %V 134 %X Given a real-valued hypothesis class H, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from H given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of H is necessary for private learnability. Here online learnability of H is characterized by the finiteness of its eta-sequential fat-shattering dimension, sfat_eta(H), for all eta > 0. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that H is privately learnable if lim_{eta -> 0} sfat_eta(H) is finite, which is a fairly restrictive condition. We show that under the relaxed condition liminf_{eta -> 0} eta * sfat_eta(H) = 0, H is privately learnable, thus establishing the first nonparametric private learnability guarantee for classes H with sfat_eta(H) diverging as eta -> 0. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
APA
Golowich, N. (2021). Differentially Private Nonparametric Regression Under a Growth Condition. Proceedings of Thirty Fourth Conference on Learning Theory, in Proceedings of Machine Learning Research 134:2149-2192. Available from https://proceedings.mlr.press/v134/golowich21a.html.