Parametric Bootstrap for Differentially Private Confidence Intervals

Cecilia Ferrando, Shufan Wang, Daniel Sheldon
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:1598-1618, 2022.

Abstract

The goal of this paper is to develop a practical and general-purpose approach to construct confidence intervals for differentially private parametric estimation. We find that the parametric bootstrap is a simple and effective solution. It cleanly reasons about variability of both the data sample and the randomized privacy mechanism and applies "out of the box" to a wide class of private estimation routines. It can also help correct bias caused by clipping data to limit sensitivity. We prove that the parametric bootstrap gives consistent confidence intervals in two broadly relevant settings, including a novel adaptation to linear regression that avoids accessing the covariate data multiple times. We demonstrate its effectiveness for a variety of estimators, and find empirically that it provides confidence intervals with good coverage even at modest sample sizes and performs better than alternative approaches.
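To make the abstract's idea concrete, below is a minimal sketch (not the authors' code) of a parametric-bootstrap confidence interval for a differentially private mean. It assumes, for simplicity, a Gaussian population with a known standard deviation sigma, uses a clipped mean with Laplace noise as the private estimator, and forms a plain percentile interval; the function names, the parameters lo, hi, eps, and B, and the percentile construction are illustrative choices, while the paper treats more general settings such as exponential families and linear regression.

import numpy as np

def private_mean(x, lo, hi, eps, rng):
    """Clip to [lo, hi], average, and add Laplace noise calibrated to the sensitivity."""
    n = len(x)
    clipped = np.clip(x, lo, hi)
    sensitivity = (hi - lo) / n                    # L1 sensitivity of the clipped mean
    return clipped.mean() + rng.laplace(scale=sensitivity / eps)

def parametric_bootstrap_ci(x, lo, hi, eps, sigma, B=2000, alpha=0.05, seed=0):
    """Illustrative parametric-bootstrap CI for a Gaussian mean with known sigma.

    The fitted model N(theta_hat, sigma^2) is resampled B times, and the full
    private mechanism (clipping + Laplace noise) is re-run on each synthetic
    sample, so the interval reflects both sampling and privacy-noise variability.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    theta_hat = private_mean(x, lo, hi, eps, rng)  # the only access to the real data
    boot = np.empty(B)
    for b in range(B):
        x_star = rng.normal(theta_hat, sigma, size=n)   # simulate from the fitted model
        boot[b] = private_mean(x_star, lo, hi, eps, rng) # replay the private estimator
    lo_q, hi_q = np.quantile(boot, [alpha / 2, 1 - alpha / 2])  # percentile interval
    return theta_hat, (lo_q, hi_q)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.normal(2.0, 1.0, size=500)
    est, ci = parametric_bootstrap_ci(data, lo=-5, hi=5, eps=1.0, sigma=1.0)
    print(f"private estimate: {est:.3f}, 95% CI: ({ci[0]:.3f}, {ci[1]:.3f})")

In a fuller treatment, nuisance parameters such as sigma would also be estimated privately, and the bootstrap distribution could be used to correct clipping bias rather than only to set interval endpoints.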

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-ferrando22a,
  title     = {Parametric Bootstrap for Differentially Private Confidence Intervals},
  author    = {Ferrando, Cecilia and Wang, Shufan and Sheldon, Daniel},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {1598--1618},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/ferrando22a/ferrando22a.pdf},
  url       = {https://proceedings.mlr.press/v151/ferrando22a.html},
  abstract  = {The goal of this paper is to develop a practical and general-purpose approach to construct confidence intervals for differentially private parametric estimation. We find that the parametric bootstrap is a simple and effective solution. It cleanly reasons about variability of both the data sample and the randomized privacy mechanism and applies "out of the box" to a wide class of private estimation routines. It can also help correct bias caused by clipping data to limit sensitivity. We prove that the parametric bootstrap gives consistent confidence intervals in two broadly relevant settings, including a novel adaptation to linear regression that avoids accessing the covariate data multiple times. We demonstrate its effectiveness for a variety of estimators, and find empirically that it provides confidence intervals with good coverage even at modest sample sizes and performs better than alternative approaches.}
}
Endnote
%0 Conference Paper
%T Parametric Bootstrap for Differentially Private Confidence Intervals
%A Cecilia Ferrando
%A Shufan Wang
%A Daniel Sheldon
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-ferrando22a
%I PMLR
%P 1598--1618
%U https://proceedings.mlr.press/v151/ferrando22a.html
%V 151
%X The goal of this paper is to develop a practical and general-purpose approach to construct confidence intervals for differentially private parametric estimation. We find that the parametric bootstrap is a simple and effective solution. It cleanly reasons about variability of both the data sample and the randomized privacy mechanism and applies "out of the box" to a wide class of private estimation routines. It can also help correct bias caused by clipping data to limit sensitivity. We prove that the parametric bootstrap gives consistent confidence intervals in two broadly relevant settings, including a novel adaptation to linear regression that avoids accessing the covariate data multiple times. We demonstrate its effectiveness for a variety of estimators, and find empirically that it provides confidence intervals with good coverage even at modest sample sizes and performs better than alternative approaches.
APA
Ferrando, C., Wang, S. & Sheldon, D. (2022). Parametric Bootstrap for Differentially Private Confidence Intervals. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:1598-1618. Available from https://proceedings.mlr.press/v151/ferrando22a.html.