On Convex Optimization with Semi-Sensitive Features

Badih Ghazi, Pritish Kamath, Ravi Kumar, Pasin Manurangsi, Raghu Meka, Chiyuan Zhang
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:1916-1938, 2024.

Abstract

We study the differentially private (DP) empirical risk minimization (ERM) problem in the semi-sensitive DP setting, where only some of the features are sensitive. This generalizes the Label DP setting, in which only the label is sensitive. We give improved upper and lower bounds on the excess risk for DP-ERM. In particular, we show that the error scales only polylogarithmically with the size of the sensitive domain, improving on previous bounds that scale polynomially in the sensitive domain size (Ghazi et al., NeurIPS 2021).
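To make the setting concrete, here is a minimal sketch of the neighboring-dataset relation that semi-sensitive DP suggests: each record splits into a non-sensitive part and a protected part (sensitive features plus label), and neighboring datasets agree on every non-sensitive part but may differ in the protected part of one record. The class and function names are illustrative assumptions, not from the paper.

```python
from dataclasses import dataclass
from typing import Tuple, List

@dataclass(frozen=True)
class Example:
    public_features: Tuple[float, ...]     # non-sensitive part of the record
    sensitive_features: Tuple[float, ...]  # protected features
    label: int                             # protected label (generalizes Label DP)

def semi_sensitive_neighbors(d1: List[Example], d2: List[Example]) -> bool:
    """Illustrative neighboring relation: same size, identical non-sensitive
    features in every position, and the protected part (sensitive features
    and/or label) differs in at most one record."""
    if len(d1) != len(d2):
        return False
    diffs = 0
    for a, b in zip(d1, d2):
        if a.public_features != b.public_features:
            return False  # non-sensitive features are not protected
        if (a.sensitive_features, a.label) != (b.sensitive_features, b.label):
            diffs += 1
    return diffs <= 1
```

Under this relation, taking the sensitive feature tuple to be empty recovers the usual Label DP neighboring relation, matching the generalization described in the abstract.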

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-ghazi24a,
  title     = {On Convex Optimization with Semi-Sensitive Features},
  author    = {Ghazi, Badih and Kamath, Pritish and Kumar, Ravi and Manurangsi, Pasin and Meka, Raghu and Zhang, Chiyuan},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {1916--1938},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/ghazi24a/ghazi24a.pdf},
  url       = {https://proceedings.mlr.press/v247/ghazi24a.html},
  abstract  = {We study the differentially private (DP) empirical risk minimization (ERM) problem under the \emph{semi-sensitive DP} setting where only some features are sensitive. This generalizes the Label DP setting where only the label is sensitive. We give improved upper and lower bounds on the excess risk for DP-ERM. In particular, we show that the error only scales polylogarithmically in terms of the sensitive domain size, improving upon previous results that scale polynomially in the size of the sensitive domain (Ghazi et al., NeurIPS 2021).}
}
Endnote
%0 Conference Paper
%T On Convex Optimization with Semi-Sensitive Features
%A Badih Ghazi
%A Pritish Kamath
%A Ravi Kumar
%A Pasin Manurangsi
%A Raghu Meka
%A Chiyuan Zhang
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-ghazi24a
%I PMLR
%P 1916--1938
%U https://proceedings.mlr.press/v247/ghazi24a.html
%V 247
%X We study the differentially private (DP) empirical risk minimization (ERM) problem under the \emph{semi-sensitive DP} setting where only some features are sensitive. This generalizes the Label DP setting where only the label is sensitive. We give improved upper and lower bounds on the excess risk for DP-ERM. In particular, we show that the error only scales polylogarithmically in terms of the sensitive domain size, improving upon previous results that scale polynomially in the size of the sensitive domain (Ghazi et al., NeurIPS 2021).
APA
Ghazi, B., Kamath, P., Kumar, R., Manurangsi, P., Meka, R., & Zhang, C. (2024). On Convex Optimization with Semi-Sensitive Features. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:1916-1938. Available from https://proceedings.mlr.press/v247/ghazi24a.html.
