Universal Bias Reduction in Estimation of Smooth Additive Function in High Dimensions

Fan Zhou, Ping Li, Cun-Hui Zhang
Proceedings of The 34th International Conference on Algorithmic Learning Theory, PMLR 201:1555-1578, 2023.

Abstract

Suppose we observe $\mathbf{x}_j = \boldsymbol{\theta} + \boldsymbol{\varepsilon}_j$, $j=1,\dots,n$, where $\boldsymbol{\theta} \in \mathbb{R}^d$ is an unknown parameter and the $\boldsymbol{\varepsilon}_j$ are i.i.d. random noise vectors drawn from a general distribution. We study the estimation of $f(\boldsymbol{\theta}) := \sum_{i=1}^d f_i(\theta_i)$ when $f:\mathbb{R}^d\rightarrow \mathbb{R}$ is a given smooth additive function and $d$ is large. Inspired by recent work on estimating $f(\boldsymbol{\theta})$ under the Gaussian shift model via a Fourier-analytic approach, we propose a new estimator that is easy to implement and fast to compute. We show that the new estimator achieves effective bias reduction universally under a minimal moment constraint. Furthermore, we establish its asymptotic normality, which implies that the new estimator is asymptotically efficient. When $f_i$ is sufficiently smooth and $d$ is large, these properties make the new estimator rate-optimal. The estimator's efficient computation and the minimal requirements placed on the noise make this work well suited to real-world applications.
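To make the setting concrete, below is a minimal numerical sketch in Python (hypothetical code, not from the paper). It simulates the observation model $\mathbf{x}_j = \boldsymbol{\theta} + \boldsymbol{\varepsilon}_j$ and contrasts the naive plug-in estimator $f(\bar{\mathbf{x}})$ with a classical second-order Taylor bias correction. The correction shown is a standard textbook device used purely for illustration of the bias-reduction problem, not the Fourier-analytic estimator constructed in the paper, and all names and parameter choices are assumptions.

import numpy as np

# Toy illustration of the problem setup. We observe x_j = theta + eps_j
# and want to estimate f(theta) = sum_i f_i(theta_i). Since
# E[f_i(x_bar_i)] - f_i(theta_i) ~ f''_i(theta_i) * Var(x_bar_i) / 2,
# a classical (non-paper) correction subtracts the estimated leading
# bias term f''_i(x_bar_i) * s_i^2 / (2n) in each coordinate.

rng = np.random.default_rng(0)
d, n = 1000, 50
theta = rng.uniform(-1.0, 1.0, size=d)

# Non-Gaussian noise: the paper targets general noise distributions
# under a minimal moment constraint, so Laplace noise is a fair test.
eps = rng.laplace(scale=0.5, size=(n, d))
x = theta + eps                      # observations, shape (n, d)

f = np.sin                           # take f_i = sin in every coordinate
f2 = lambda t: -np.sin(t)            # second derivative of sin

x_bar = x.mean(axis=0)               # coordinate-wise sample mean
s2 = x.var(axis=0, ddof=1)           # coordinate-wise sample variance

plug_in = f(x_bar).sum()
bias_corrected = (f(x_bar) - f2(x_bar) * s2 / (2 * n)).sum()

truth = f(theta).sum()
print(f"truth:          {truth:.3f}")
print(f"plug-in:        {plug_in:.3f}")
print(f"bias-corrected: {bias_corrected:.3f}")

With $d$ large relative to $n$, the per-coordinate bias of the plug-in estimator accumulates across the $d$ summands, which is why bias reduction (here a single Taylor term; in the paper, a universal Fourier-analytic construction) is the central issue.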

Cite this Paper


BibTeX
@InProceedings{pmlr-v201-zhou23a,
  title     = {Universal Bias Reduction in Estimation of Smooth Additive Function in High Dimensions},
  author    = {Zhou, Fan and Li, Ping and Zhang, Cun-Hui},
  booktitle = {Proceedings of The 34th International Conference on Algorithmic Learning Theory},
  pages     = {1555--1578},
  year      = {2023},
  editor    = {Agrawal, Shipra and Orabona, Francesco},
  volume    = {201},
  series    = {Proceedings of Machine Learning Research},
  month     = {20 Feb--23 Feb},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v201/zhou23a/zhou23a.pdf},
  url       = {https://proceedings.mlr.press/v201/zhou23a.html}
}
APA
Zhou, F., Li, P., & Zhang, C.-H. (2023). Universal Bias Reduction in Estimation of Smooth Additive Function in High Dimensions. Proceedings of The 34th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 201:1555-1578. Available from https://proceedings.mlr.press/v201/zhou23a.html.
