Insufficient Statistics Perturbation: Stable Estimators for Private Least Squares (Extended Abstract)

Gavin Brown, Jonathan Hayase, Samuel Hopkins, Weihao Kong, Xiyang Liu, Sewoong Oh, Juan C Perdomo, Adam Smith
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:750-751, 2024.

Abstract

We present a sample- and time-efficient differentially private algorithm for ordinary least squares, with error that depends linearly on the dimension and is independent of the condition number of $X^\top X$, where $X$ is the design matrix. All prior private algorithms for this task require either $d^{3/2}$ examples, error growing polynomially with the condition number, or exponential time. Our near-optimal accuracy guarantee holds for any dataset with bounded statistical leverage and bounded residuals. Technically, we build on the approach of Brown et al. (2023) for private mean estimation, adding scaled noise to a carefully designed stable nonprivate estimator of the empirical regression vector.
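To make the "add noise to a stable estimator" recipe concrete, the sketch below shows plain output perturbation for least squares via the standard Gaussian mechanism. This is only an illustration, not the paper's algorithm: the function name, the user-supplied `sensitivity` bound, and the use of ordinary `lstsq` are assumptions for exposition. The paper's contribution is precisely a carefully designed stable nonprivate estimator whose sensitivity can be bounded under bounded leverage and residuals, which plain OLS does not guarantee.

import numpy as np

def gaussian_mechanism_ols(X, y, epsilon, delta, sensitivity):
    """Illustrative output perturbation for least squares (NOT the paper's
    algorithm): solve OLS on (X, y), then add Gaussian noise calibrated to
    an assumed l2-sensitivity bound supplied by the caller."""
    # Empirical regression vector (the nonprivate estimate being perturbed).
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    # Standard (epsilon, delta)-DP Gaussian mechanism noise scale.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return beta_hat + np.random.default_rng().normal(0.0, sigma, size=beta_hat.shape)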

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-brown24b,
  title     = {Insufficient Statistics Perturbation: Stable Estimators for Private Least Squares (Extended Abstract)},
  author    = {Brown, Gavin and Hayase, Jonathan and Hopkins, Samuel and Kong, Weihao and Liu, Xiyang and Oh, Sewoong and Perdomo, Juan C and Smith, Adam},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {750--751},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/brown24b/brown24b.pdf},
  url       = {https://proceedings.mlr.press/v247/brown24b.html}
}
