Finite-Sample Symmetric Mean Estimation with Fisher Information Rate

Shivam Gupta, Jasper C. H. Lee, Eric Price
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:4777-4830, 2023.

Abstract

The mean of an unknown variance-$\sigma^2$ distribution $f$ can be estimated from $n$ samples with variance $\frac{\sigma^2}{n}$ and a nearly matching subgaussian rate. When $f$ is known up to translation, this can be improved asymptotically to $\frac{1}{nI}$, where $I$ is the Fisher information of the distribution. Such an improvement is not possible for general unknown $f$, but [Stone 1975] showed that this asymptotic convergence \emph{is} possible if $f$ is \emph{symmetric} about its mean. Stone’s bound is asymptotic, however: the $n$ required for convergence depends in an unspecified way on the distribution $f$ and the failure probability $\delta$. In this paper we give finite-sample guarantees for symmetric mean estimation in terms of Fisher information. For every $f, n, \delta$ with $n > \log \frac{1}{\delta}$, we get convergence close to that of a subgaussian with variance $\frac{1}{n I_r}$, where $I_r$ is the $r$-smoothed Fisher information for a smoothing radius $r$ that decays polynomially in $n$. Such a bound essentially matches the finite-sample guarantees in the known-$f$ setting.
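To see why the $\frac{1}{nI}$ rate can beat $\frac{\sigma^2}{n}$, consider the Laplace$(0, b)$ location family: its variance is $\sigma^2 = 2b^2$ while its Fisher information is $I = 1/b^2$, so $\frac{1}{nI} = \frac{b^2}{n}$ is half of $\frac{\sigma^2}{n}$. The sketch below only illustrates that gap and is not the estimator or an experiment from the paper: it compares the sample mean against the sample median, which is the maximum-likelihood estimator when $f$ is known to be Laplace.

```python
# Illustrative sketch, not the paper's method: for the Laplace(0, b) location
# family, the sample mean attains variance sigma^2/n = 2*b^2/n, while the sample
# median (the MLE when f is known) approaches the Fisher-information rate
# 1/(n*I) = b^2/n.
import numpy as np

rng = np.random.default_rng(0)
n, trials, b = 1_000, 5_000, 1.0  # sample size, Monte Carlo repetitions, Laplace scale

samples = rng.laplace(loc=0.0, scale=b, size=(trials, n))
mean_est = samples.mean(axis=1)          # empirical variance should be ~ 2*b**2 / n
median_est = np.median(samples, axis=1)  # empirical variance should be ~ b**2 / n

print(f"sample mean   var: {mean_est.var():.2e}   (sigma^2/n = {2 * b**2 / n:.2e})")
print(f"sample median var: {median_est.var():.2e}   (1/(n I)   = {b**2 / n:.2e})")
```

The paper addresses the harder setting where $f$ is only known to be symmetric: it gives finite-sample guarantees approaching the $\frac{1}{n I_r}$ rate, rather than the asymptotic-only guarantee of [Stone 1975].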

Cite this Paper


BibTeX
@InProceedings{pmlr-v195-gupta23a,
  title     = {Finite-Sample Symmetric Mean Estimation with Fisher Information Rate},
  author    = {Gupta, Shivam and Lee, Jasper C. H. and Price, Eric},
  booktitle = {Proceedings of Thirty Sixth Conference on Learning Theory},
  pages     = {4777--4830},
  year      = {2023},
  editor    = {Neu, Gergely and Rosasco, Lorenzo},
  volume    = {195},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--15 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v195/gupta23a/gupta23a.pdf},
  url       = {https://proceedings.mlr.press/v195/gupta23a.html},
  abstract  = {The mean of an unknown variance-$\sigma^2$ distribution $f$ can be estimated from $n$ samples with variance $\frac{\sigma^2}{n}$ and a nearly matching subgaussian rate. When $f$ is known up to translation, this can be improved asymptotically to $\frac{1}{nI}$, where $I$ is the Fisher information of the distribution. Such an improvement is not possible for general unknown $f$, but [Stone 1975] showed that this asymptotic convergence \emph{is} possible if $f$ is \emph{symmetric} about its mean. Stone’s bound is asymptotic, however: the $n$ required for convergence depends in an unspecified way on the distribution $f$ and the failure probability $\delta$. In this paper we give finite-sample guarantees for symmetric mean estimation in terms of Fisher information. For every $f, n, \delta$ with $n > \log \frac{1}{\delta}$, we get convergence close to that of a subgaussian with variance $\frac{1}{n I_r}$, where $I_r$ is the $r$-smoothed Fisher information for a smoothing radius $r$ that decays polynomially in $n$. Such a bound essentially matches the finite-sample guarantees in the known-$f$ setting.}
}
Endnote
%0 Conference Paper
%T Finite-Sample Symmetric Mean Estimation with Fisher Information Rate
%A Shivam Gupta
%A Jasper C. H. Lee
%A Eric Price
%B Proceedings of Thirty Sixth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2023
%E Gergely Neu
%E Lorenzo Rosasco
%F pmlr-v195-gupta23a
%I PMLR
%P 4777--4830
%U https://proceedings.mlr.press/v195/gupta23a.html
%V 195
%X The mean of an unknown variance-$\sigma^2$ distribution $f$ can be estimated from $n$ samples with variance $\frac{\sigma^2}{n}$ and a nearly matching subgaussian rate. When $f$ is known up to translation, this can be improved asymptotically to $\frac{1}{nI}$, where $I$ is the Fisher information of the distribution. Such an improvement is not possible for general unknown $f$, but [Stone 1975] showed that this asymptotic convergence \emph{is} possible if $f$ is \emph{symmetric} about its mean. Stone’s bound is asymptotic, however: the $n$ required for convergence depends in an unspecified way on the distribution $f$ and the failure probability $\delta$. In this paper we give finite-sample guarantees for symmetric mean estimation in terms of Fisher information. For every $f, n, \delta$ with $n > \log \frac{1}{\delta}$, we get convergence close to that of a subgaussian with variance $\frac{1}{n I_r}$, where $I_r$ is the $r$-smoothed Fisher information for a smoothing radius $r$ that decays polynomially in $n$. Such a bound essentially matches the finite-sample guarantees in the known-$f$ setting.
APA
Gupta, S., Lee, J. C. H. & Price, E. (2023). Finite-Sample Symmetric Mean Estimation with Fisher Information Rate. Proceedings of Thirty Sixth Conference on Learning Theory, in Proceedings of Machine Learning Research 195:4777-4830. Available from https://proceedings.mlr.press/v195/gupta23a.html.