Selective Nonparametric Regression via Testing

Fedor Noskov, Alexander Fishkov, Maxim Panov
Proceedings of the 15th Asian Conference on Machine Learning, PMLR 222:1023-1038, 2024.

Abstract

Prediction with the possibility of abstention (or selective prediction) is an important problem for error-critical machine learning applications. While well studied in the classification setup, selective approaches to regression are much less developed. In this work, we consider the nonparametric heteroskedastic regression problem and develop an abstention procedure via testing the hypothesis on the value of the conditional variance at a given point. Unlike existing methods, the proposed one accounts not only for the value of the variance itself but also for the uncertainty of the corresponding variance predictor. We prove non-asymptotic bounds on the risk of the resulting estimator and show the existence of several different convergence regimes. The theoretical analysis is illustrated with a series of experiments on simulated and real-world data.
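To make the idea concrete, here is a minimal sketch (not the authors' procedure) of test-based abstention in a heteroskedastic setting: a k-NN predictor abstains at a point unless a one-sided chi-square test rejects the hypothesis that the local conditional variance exceeds a tolerance, so the decision reflects both the variance estimate and its uncertainty. All names, the Gaussian-residual assumption, and the specific test are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

# Heteroskedastic data: noise level grows with |x|.
n = 2000
x = rng.uniform(-1, 1, n)
y = np.sin(3 * x) + np.abs(x) * rng.standard_normal(n)

def predict_or_abstain(x0, k=50, sigma2_max=0.25, alpha=0.05):
    """k-NN mean prediction; abstain unless we can reject
    H0: Var(y | x0) >= sigma2_max at level alpha, using a
    chi-square test on the k nearest responses (treated as
    roughly Gaussian for illustration)."""
    idx = np.argsort(np.abs(x - x0))[:k]
    mu = y[idx].mean()
    s2 = y[idx].var(ddof=1)
    # Under H0 with Gaussian noise, (k-1) * s2 / sigma2_max
    # is stochastically at least chi-square(k-1); a small
    # value lets us reject H0 and predict confidently.
    stat = (k - 1) * s2 / sigma2_max
    if stat < chi2.ppf(alpha, df=k - 1):
        return mu   # low-variance region: predict
    return None     # abstain

print(predict_or_abstain(0.05))  # near 0 the noise is small: predicts
print(predict_or_abstain(0.95))  # near 1 the noise is large: abstains
```

Thresholding the test statistic rather than the raw variance estimate is what lets the rule hedge against estimation error: with few or noisy neighbors the test fails to reject and the predictor abstains, which is the qualitative behavior the abstract describes.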

Cite this Paper


BibTeX
@InProceedings{pmlr-v222-noskov24a,
  title     = {Selective Nonparametric Regression via Testing},
  author    = {Noskov, Fedor and Fishkov, Alexander and Panov, Maxim},
  booktitle = {Proceedings of the 15th Asian Conference on Machine Learning},
  pages     = {1023--1038},
  year      = {2024},
  editor    = {Yanıkoğlu, Berrin and Buntine, Wray},
  volume    = {222},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v222/noskov24a/noskov24a.pdf},
  url       = {https://proceedings.mlr.press/v222/noskov24a.html},
  abstract  = {Prediction with the possibility of abstention (or selective prediction) is an important problem for error-critical machine learning applications. While well-studied in the classification setup, selective approaches to regression are much less developed. In this work, we consider the nonparametric heteroskedastic regression problem and develop an abstention procedure via testing the hypothesis on the value of the conditional variance at a given point. Unlike existing methods, the proposed one allows to account not only for the value of the variance itself but also for the uncertainty of the corresponding variance predictor. We prove non-asymptotic bounds on the risk of the resulting estimator and show the existence of several different convergence regimes. Theoretical analysis is illustrated with a series of experiments on simulated and real-world data.}
}
Endnote
%0 Conference Paper
%T Selective Nonparametric Regression via Testing
%A Fedor Noskov
%A Alexander Fishkov
%A Maxim Panov
%B Proceedings of the 15th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Berrin Yanıkoğlu
%E Wray Buntine
%F pmlr-v222-noskov24a
%I PMLR
%P 1023--1038
%U https://proceedings.mlr.press/v222/noskov24a.html
%V 222
%X Prediction with the possibility of abstention (or selective prediction) is an important problem for error-critical machine learning applications. While well-studied in the classification setup, selective approaches to regression are much less developed. In this work, we consider the nonparametric heteroskedastic regression problem and develop an abstention procedure via testing the hypothesis on the value of the conditional variance at a given point. Unlike existing methods, the proposed one allows to account not only for the value of the variance itself but also for the uncertainty of the corresponding variance predictor. We prove non-asymptotic bounds on the risk of the resulting estimator and show the existence of several different convergence regimes. Theoretical analysis is illustrated with a series of experiments on simulated and real-world data.
APA
Noskov, F., Fishkov, A. & Panov, M. (2024). Selective Nonparametric Regression via Testing. Proceedings of the 15th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 222:1023-1038. Available from https://proceedings.mlr.press/v222/noskov24a.html.