Efficient Estimation of a Gaussian Mean with Local Differential Privacy

Nikita Kalinin, Lukas Steinberger
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:118-126, 2025.

Abstract

In this paper, we study the problem of estimating the unknown mean $\theta$ of a unit-variance Gaussian distribution in a locally differentially private (LDP) way. In the high-privacy regime ($\epsilon\le 1$), we identify an optimal privacy mechanism that asymptotically minimizes the variance of the estimator. Our main technical contribution is the maximization of the Fisher information of the sanitized data with respect to the local privacy mechanism $Q$. We find that the exact solution $Q_{\theta,\epsilon}$ of this maximization is the sign mechanism, which applies randomized response to the sign of $X_i-\theta$, where $X_1,\dots,X_n$ are the confidential i.i.d. original samples. However, since this optimal local mechanism depends on the unknown mean $\theta$, we employ a two-stage LDP estimation procedure that splits the agents into two groups. The first $n_1$ observations are used to estimate the parameter $\theta$ consistently, but not necessarily efficiently, by $\tilde{\theta}_{n_1}$. This estimate is then updated by applying the sign mechanism with $\tilde{\theta}_{n_1}$ in place of $\theta$ to the remaining $n-n_1$ observations, yielding an LDP and efficient estimator of the unknown mean.
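
To make the two-stage construction concrete, below is a minimal Python sketch, assuming unit-variance Gaussian samples with unknown mean. The first-stage estimator (clipping to $[-B,B]$ plus Laplace noise) and the truncation bound $B$ are illustrative choices only, not taken from the paper; the second stage inverts the moment relation $E[Z_i]=(2p_\epsilon-1)(2\Phi(\theta-\tilde{\theta}_{n_1})-1)$, where $p_\epsilon=e^\epsilon/(1+e^\epsilon)$ is the randomized-response probability of reporting the true sign. This is one natural update rule, not necessarily the exact estimator analyzed by the authors.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def ldp_clip_laplace_mean(x, eps, B=3.0):
    # First-stage LDP estimate (illustrative choice, not from the paper):
    # clip each sample to [-B, B] and add Laplace noise of scale 2B/eps.
    z = np.clip(x, -B, B) + rng.laplace(scale=2 * B / eps, size=x.shape)
    return z.mean()

def sign_mechanism(x, center, eps):
    # Randomized response applied to sign(x - center): report the true sign
    # with probability e^eps / (1 + e^eps), otherwise flip it (eps-LDP).
    s = np.sign(x - center)
    s[s == 0] = 1.0
    keep = rng.random(x.shape) < np.exp(eps) / (1 + np.exp(eps))
    return np.where(keep, s, -s)

def two_stage_estimate(x, eps, n1):
    # Each sample is sanitized by exactly one eps-LDP mechanism:
    # the first n1 by clip+Laplace, the remaining n-n1 by the sign mechanism.
    theta_tilde = ldp_clip_laplace_mean(x[:n1], eps)
    z = sign_mechanism(x[n1:], theta_tilde, eps)
    p = np.exp(eps) / (1 + np.exp(eps))       # prob. of reporting the true sign
    q = 0.5 * (1 + z.mean() / (2 * p - 1))    # estimate of P(X > theta_tilde)
    q = np.clip(q, 1e-6, 1 - 1e-6)            # keep the quantile well defined
    return theta_tilde + norm.ppf(q)          # since P(X > t) = Phi(theta - t)

theta, eps, n = 1.5, 0.5, 100_000
x = rng.normal(theta, 1.0, size=n)
print(two_stage_estimate(x, eps, n1=n // 10))  # should be close to 1.5

Note that the moment inversion $\theta=\tilde{\theta}_{n_1}+\Phi^{-1}(P(X>\tilde{\theta}_{n_1}))$ holds exactly for any centering point, so the second stage corrects the first-stage error up to the sampling noise in the average of the sanitized signs.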

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-nikita25a,
  title     = {Efficient Estimation of a Gaussian Mean with Local Differential Privacy},
  author    = {Kalinin, Nikita and Steinberger, Lukas},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {118--126},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/nikita25a/nikita25a.pdf},
  url       = {https://proceedings.mlr.press/v258/nikita25a.html},
  abstract  = {In this paper, we study the problem of estimating the unknown mean $\theta$ of a unit variance Gaussian distribution in a locally differentially private (LDP) way. In the high-privacy regime ($\epsilon\le 1$), we identify an optimal privacy mechanism that minimizes the variance of the estimator asymptotically. Our main technical contribution is the maximization of the Fisher information of the sanitized data with respect to the local privacy mechanism $Q$. We find that the exact solution $Q_{\theta,\epsilon}$ of this maximization is the sign mechanism that applies randomized response to the sign of $X_i-\theta$, where $X_1,\dots,X_n$ are the confidential iid original samples. However, since this optimal local mechanism depends on the unknown mean $\theta$, we employ a two-stage LDP parameter estimation procedure which requires splitting agents into two groups. The first $n_1$ observations are used to consistently but not necessarily efficiently estimate the parameter $\theta$ by $\tilde{\theta}_{n_1}$. Then this estimate is updated by applying the sign mechanism with $\tilde{\theta}_{n_1}$ instead of $\theta$ to the remaining $n-n_1$ observations, to obtain an LDP and efficient estimator of the unknown mean.}
}
APA
Kalinin, N. & Steinberger, L. (2025). Efficient Estimation of a Gaussian Mean with Local Differential Privacy. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:118-126. Available from https://proceedings.mlr.press/v258/nikita25a.html.
