Finite-Sample Symmetric Mean Estimation with Fisher Information Rate
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:4777-4830, 2023.
Abstract
The mean of an unknown variance-$\sigma^2$ distribution $f$ can be estimated from $n$ samples with variance $\frac{\sigma^2}{n}$ and a nearly matching subgaussian rate. When $f$ is known up to translation, this can be improved asymptotically to $\frac{1}{nI}$, where $I$ is the Fisher information of the distribution. Such an improvement is not possible for general unknown $f$, but [Stone, 1975] showed that this asymptotic convergence \emph{is} possible if $f$ is \emph{symmetric} about its mean. Stone’s bound is asymptotic, however: the $n$ required for convergence depends in an unspecified way on the distribution $f$ and the failure probability $\delta$. In this paper we give finite-sample guarantees for symmetric mean estimation in terms of Fisher information. For every $f, n, \delta$ with $n > \log \frac{1}{\delta}$, we get convergence close to a subgaussian with variance $\frac{1}{n I_r}$, where $I_r$ is the $r$-smoothed Fisher information with smoothing radius $r$ that decays polynomially in $n$. Such a bound essentially matches the finite-sample guarantees in the known-$f$ setting.
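The gap between the $\frac{\sigma^2}{n}$ rate and the Fisher-information rate $\frac{1}{nI}$ can be seen numerically in the known-$f$, symmetric case. The sketch below is not from the paper; it assumes a centered Laplace distribution with scale $b$ (so $\sigma^2 = 2b^2$ and $I = 1/b^2$), for which the location MLE is the sample median with asymptotic variance $\frac{1}{nI} = \frac{b^2}{n}$, half the sample mean's $\frac{\sigma^2}{n} = \frac{2b^2}{n}$. The sample sizes and trial counts are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative simulation (not the paper's estimator): compare the sample mean,
# which attains variance sigma^2 / n, against the location MLE for a known
# symmetric Laplace distribution, which attains the Fisher-information rate 1 / (n I).
rng = np.random.default_rng(0)
b = 1.0            # Laplace scale: variance sigma^2 = 2 b^2, Fisher information I = 1 / b^2
n = 1000           # samples per trial (arbitrary choice)
trials = 10_000    # number of independent trials (arbitrary choice)

samples = rng.laplace(loc=0.0, scale=b, size=(trials, n))
mean_est = samples.mean(axis=1)        # sample mean: variance ~ sigma^2 / n = 2 b^2 / n
median_est = np.median(samples, axis=1)  # Laplace location MLE (sample median): variance ~ 1 / (n I) = b^2 / n

print(f"sample mean   variance: {mean_est.var():.2e}  (theory sigma^2/n = {2 * b**2 / n:.2e})")
print(f"sample median variance: {median_est.var():.2e}  (theory 1/(n I)  = {b**2 / n:.2e})")
```

For the Laplace distribution the improvement is exactly a factor of two; the paper's contribution is a finite-sample guarantee that a comparable Fisher-information rate is achievable without knowing $f$, provided $f$ is symmetric.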