Non-asymptotic Analysis for Nonparametric Testing

Yun Yang, Zuofeng Shang, Guang Cheng
Proceedings of Thirty Third Conference on Learning Theory, PMLR 125:3709-3755, 2020.

Abstract

We develop a non-asymptotic framework for hypothesis testing in nonparametric regression where the true regression function belongs to a Sobolev space. Our statistical guarantees are exact in the sense that Type I and II errors are controlled for any finite sample size. Meanwhile, one proposed test is shown to achieve minimax rate optimality in the asymptotic sense. An important consequence of this non-asymptotic theory is a new and practically useful formula for selecting the optimal smoothing parameter in the testing statistic. Extensions of our results to general reproducing kernel Hilbert spaces and non-Gaussian error regression are also discussed.
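To make the setup concrete, here is a minimal sketch (in Python) of a kernel-based test of H0: f = 0 in the regression model y_i = f(x_i) + eps_i, where a penalized kernel fit is computed and its squared empirical norm serves as the test statistic, with the rejection threshold calibrated by Monte Carlo under the null. This is an illustration only, not the authors' exact test: the Gaussian kernel, its bandwidth, the smoothing parameter lam, the noise level sigma, and the Monte Carlo calibration are all placeholder assumptions.

import numpy as np

def gaussian_kernel(x, z, bandwidth=0.2):
    # Gaussian (RBF) kernel matrix between 1-d design points x and z.
    return np.exp(-(x[:, None] - z[None, :]) ** 2 / (2.0 * bandwidth ** 2))

def krr_test_statistic(x, y, lam):
    # Kernel ridge fit of y on x, then T = ||f_hat||_n^2 (squared empirical norm).
    n = len(x)
    K = gaussian_kernel(x, x)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    f_hat = K @ alpha
    return np.mean(f_hat ** 2)

def mc_test(x, y, lam, sigma=1.0, level=0.05, n_mc=500, seed=None):
    # Reject H0: f = 0 if the observed statistic exceeds the Monte Carlo
    # (1 - level) quantile of its null distribution (pure Gaussian noise).
    rng = np.random.default_rng(seed)
    t_obs = krr_test_statistic(x, y, lam)
    t_null = np.array([
        krr_test_statistic(x, sigma * rng.standard_normal(len(x)), lam)
        for _ in range(n_mc)
    ])
    return t_obs > np.quantile(t_null, 1.0 - level)

# Usage on synthetic data with a smooth alternative f(x) = 0.5*sin(2*pi*x):
rng = np.random.default_rng(0)
x = rng.uniform(size=200)
y = 0.5 * np.sin(2.0 * np.pi * x) + rng.standard_normal(200)
print(mc_test(x, y, lam=1e-3, seed=1))  # True indicates rejection of H0

In this sketch, lam plays the role of the smoothing parameter in the testing statistic referred to in the abstract; the paper's non-asymptotic theory is what yields a practical formula for choosing it so that Type I and II errors are controlled at any finite sample size.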

Cite this Paper


BibTeX
@InProceedings{pmlr-v125-yang20a,
  title     = {Non-asymptotic Analysis for Nonparametric Testing},
  author    = {Yang, Yun and Shang, Zuofeng and Cheng, Guang},
  booktitle = {Proceedings of Thirty Third Conference on Learning Theory},
  pages     = {3709--3755},
  year      = {2020},
  editor    = {Abernethy, Jacob and Agarwal, Shivani},
  volume    = {125},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v125/yang20a/yang20a.pdf},
  url       = {https://proceedings.mlr.press/v125/yang20a.html},
  abstract  = {We develop a non-asymptotic framework for hypothesis testing in nonparametric regression where the true regression function belongs to a Sobolev space. Our statistical guarantees are exact in the sense that Type I and II errors are controlled for any finite sample size. Meanwhile, one proposed test is shown to achieve minimax rate optimality in the asymptotic sense. An important consequence of this non-asymptotic theory is a new and practically useful formula for selecting the optimal smoothing parameter in the testing statistic. Extensions of our results to general reproducing kernel Hilbert spaces and non-Gaussian error regression are also discussed.}
}
APA
Yang, Y., Shang, Z. & Cheng, G. (2020). Non-asymptotic Analysis for Nonparametric Testing. Proceedings of Thirty Third Conference on Learning Theory, in Proceedings of Machine Learning Research 125:3709-3755. Available from https://proceedings.mlr.press/v125/yang20a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v125/yang20a/yang20a.pdf