Universal Bias Reduction in Estimation of Smooth Additive Function in High Dimensions
Proceedings of The 34th International Conference on Algorithmic Learning Theory, PMLR 201:1555-1578, 2023.
Abstract
Suppose we observe $\mathbf{x}_j = \boldsymbol{\theta} + \boldsymbol{\varepsilon}_j$, $j=1,\dots,n$, where $\boldsymbol{\theta} \in \mathbb{R}^d$ is an unknown parameter and the $\boldsymbol{\varepsilon}_j$ are i.i.d. random noise vectors drawn from a general distribution. We study the estimation of $f(\boldsymbol{\theta}):= \sum_{i=1}^d f_i(\theta_i)$, where $f:\mathbb{R}^d\rightarrow \mathbb{R}$ is a given smooth additive function and $d$ is large. Inspired by recent work on estimating $f(\boldsymbol{\theta})$ under the Gaussian shift model via a Fourier-analytic approach, we propose a new estimator that is easy to implement and fast to compute. We show that the new estimator achieves effective bias reduction universally under a minimal moment constraint. Furthermore, we establish its asymptotic normality, which implies that the new estimator is asymptotically efficient. When the $f_i$ are sufficiently smooth and $d$ is large, these properties make the new estimator rate optimal. Its efficient computation and the minimal requirements on the noise make this work broadly applicable to real-world problems.
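To see why bias reduction matters in this setting, the following sketch simulates the observation model from the abstract with the hypothetical choice $f_i(t) = t^2$ (not a choice made in the paper) and measures the bias of the naive plug-in estimator $f(\bar{\mathbf{x}})$. For this quadratic $f$, a standard calculation gives $\mathbb{E}\,f(\bar{\mathbf{x}}) = f(\boldsymbol{\theta}) + d\sigma^2/n$, so the plug-in bias grows linearly with the dimension $d$; this illustrates the problem the paper's estimator addresses, not the estimator itself.

```python
import numpy as np

# Observation model: x_j = theta + eps_j, j = 1..n, with theta in R^d.
# Target: f(theta) = sum_i f_i(theta_i), here with f_i(t) = t^2 (illustrative).
rng = np.random.default_rng(0)
d, n, sigma = 500, 20, 1.0
theta = rng.normal(size=d)
f_true = np.sum(theta**2)

reps = 2000
plugin = np.empty(reps)
for r in range(reps):
    x = theta + sigma * rng.normal(size=(n, d))  # n i.i.d. noisy observations
    plugin[r] = np.sum(x.mean(axis=0) ** 2)      # naive plug-in f(x_bar)

empirical_bias = plugin.mean() - f_true
theoretical_bias = d * sigma**2 / n  # bias of the plug-in for quadratic f
print(empirical_bias, theoretical_bias)
```

With $d = 500$ and $n = 20$ the plug-in bias is $d\sigma^2/n = 25$, which is large when $d \gg n$; this is the high-dimensional regime in which the paper's bias-reduced estimator is designed to operate.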