Sharp Theoretical Analysis for Nonparametric Testing under Random Projection

Meimei Liu, Zuofeng Shang, Guang Cheng
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:2175-2209, 2019.

Abstract

A common challenge in nonparametric inference is its high computational complexity when data volume is large. In this paper, we develop computationally efficient nonparametric testing by employing a random projection strategy. In the specific kernel ridge regression setup, a simple distance-based test statistic is proposed. Notably, we derive the minimum number of random projections that is sufficient for achieving testing optimality in terms of the minimax rate. As a by-product, the lower bound of projection dimension for minimax optimal estimation derived in Yang (2017) is proven to be sharp. One technical contribution is to establish upper bounds for a range of tail sums of empirical kernel eigenvalues.
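The test described in the abstract can be illustrated with a small sketch: project the n x n kernel matrix onto a few random directions, fit kernel ridge regression in the sketched space, and use a squared-norm (distance-based) statistic on the fitted function. This is only an illustrative implementation under an assumed Gaussian sketch; the function names, the RBF kernel choice, and the exact form of the statistic are ours, not necessarily the paper's.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def projected_krr_test_stat(X, y, s=20, lam=1e-2, gamma=1.0, seed=0):
    """Distance-based test statistic from randomly projected KRR.

    Sketch under an assumed Gaussian sketching matrix: replace the
    n-dimensional KRR problem with an s-dimensional one (s << n),
    then take the squared empirical norm of the fitted function as
    the statistic for H0: f = 0. Names here are illustrative.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    S = rng.standard_normal((n, s)) / np.sqrt(s)   # random projection
    KS = K @ S                                     # n x s sketched kernel
    # Sketched ridge system: (S^T K K S + n*lam * S^T K S) b = S^T K y
    A = KS.T @ KS + n * lam * (S.T @ KS)
    b = np.linalg.solve(A, KS.T @ y)
    f_hat = KS @ b                                 # fitted values
    return float(np.mean(f_hat ** 2))              # ||f_hat||_n^2
```

Solving the s x s sketched system costs O(n s^2 + s^3) rather than the O(n^3) of exact KRR, which is the computational saving the paper quantifies; its main result gives the smallest s for which such a test remains minimax optimal.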

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-liu19a,
  title     = {Sharp Theoretical Analysis for Nonparametric Testing under Random Projection},
  author    = {Liu, Meimei and Shang, Zuofeng and Cheng, Guang},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {2175--2209},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/liu19a/liu19a.pdf},
  url       = {https://proceedings.mlr.press/v99/liu19a.html},
  abstract  = {A common challenge in nonparametric inference is its high computational complexity when data volume is large. In this paper, we develop computationally efficient nonparametric testing by employing a random projection strategy. In the specific kernel ridge regression setup, a simple distance-based test statistic is proposed. Notably, we derive the minimum number of random projections that is sufficient for achieving testing optimality in terms of the minimax rate. As a by-product, the lower bound of projection dimension for minimax optimal estimation derived in Yang (2017) is proven to be sharp. One technical contribution is to establish upper bounds for a range of tail sums of empirical kernel eigenvalues.}
}
Endnote
%0 Conference Paper
%T Sharp Theoretical Analysis for Nonparametric Testing under Random Projection
%A Meimei Liu
%A Zuofeng Shang
%A Guang Cheng
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-liu19a
%I PMLR
%P 2175--2209
%U https://proceedings.mlr.press/v99/liu19a.html
%V 99
%X A common challenge in nonparametric inference is its high computational complexity when data volume is large. In this paper, we develop computationally efficient nonparametric testing by employing a random projection strategy. In the specific kernel ridge regression setup, a simple distance-based test statistic is proposed. Notably, we derive the minimum number of random projections that is sufficient for achieving testing optimality in terms of the minimax rate. As a by-product, the lower bound of projection dimension for minimax optimal estimation derived in Yang (2017) is proven to be sharp. One technical contribution is to establish upper bounds for a range of tail sums of empirical kernel eigenvalues.
APA
Liu, M., Shang, Z. & Cheng, G. (2019). Sharp Theoretical Analysis for Nonparametric Testing under Random Projection. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:2175-2209. Available from https://proceedings.mlr.press/v99/liu19a.html.