High-dimensional Location Estimation via Norm Concentration for Subgamma Vectors
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:12132-12164, 2023.
Abstract
In location estimation, we are given n samples from a known distribution f shifted by an unknown translation λ, and want to estimate λ as precisely as possible. Asymptotically, the maximum likelihood estimate achieves the Cramér-Rao bound of error N(0, 1/(nI)), where I is the Fisher information of f. However, the n required for convergence depends on f, and may be arbitrarily large. We build on the theory using smoothed estimators to bound the error for finite n in terms of I_r, the Fisher information of the r-smoothed distribution. As n→∞, r→0 at an explicit rate and this converges to the Cramér-Rao bound. We (1) improve the prior work for 1-dimensional f to converge for constant failure probability in addition to high probability, and (2) extend the theory to high-dimensional distributions. In the process, we prove a new bound on the norm of a high-dimensional random variable whose 1-dimensional projections are subgamma, which may be of independent interest.
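As a minimal illustration of the setup described above (not the paper's smoothed estimator), the sketch below runs 1-dimensional location estimation with a Laplace base distribution f of unit scale, whose Fisher information is I = 1 and whose location MLE is the sample median; the empirical error variance is then compared against the Cramér-Rao value 1/(nI). All names and parameter choices here are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch of the location-estimation setup (assumed example, not the paper's method).
# Base distribution f: Laplace with scale 1, so Fisher information I = 1 and
# the Cramér-Rao bound predicts MLE error variance roughly 1/(n*I) = 1/n.
import numpy as np

rng = np.random.default_rng(0)
n, trials, true_shift = 1000, 2000, 3.0  # hypothetical sample size, repetitions, and λ

errors = []
for _ in range(trials):
    # n samples from f shifted by the unknown translation λ = true_shift
    samples = true_shift + rng.laplace(loc=0.0, scale=1.0, size=n)
    # For a Laplace base distribution, the MLE of the location is the sample median
    mle = np.median(samples)
    errors.append(mle - true_shift)

print("empirical error variance:", np.var(errors))  # close to 1/n = 0.001 for large n
print("Cramér-Rao value 1/(nI) :", 1.0 / n)
```

For heavier-tailed or less regular f, the n needed before the empirical variance approaches 1/(nI) can be much larger, which is the finite-sample gap the smoothed-estimator analysis addresses.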