Efficient and Provable Algorithms for Covariate Shift

Deeksha Adil, Jaroslaw Blasiok
Proceedings of The 37th International Conference on Algorithmic Learning Theory, PMLR 313:1-34, 2026.

Abstract

Covariate shift, a widely used assumption in tackling _distributional shift_ (when training and test distributions differ), focuses on scenarios where the distribution of the labels conditioned on the feature vector is the same, but the distributions of features in the training and test data differ. Despite the significance and extensive work on covariate shift, theoretical guarantees for algorithms in this domain remain sparse. In this paper, we distill the essence of the covariate shift problem and focus on estimating the average $E_{\widetilde{x}\sim p_{\mathrm{test}}}f(\widetilde{x})$ of any unknown and bounded function $f$, given labeled training samples $(x_i, f(x_i))$ and unlabeled test samples $\widetilde{x}_i$; this is a core subroutine for several widely studied learning problems. We give several efficient algorithms, with provable sample complexity and computational guarantees.
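For context, the quantity the paper targets, $E_{\widetilde{x}\sim p_{\mathrm{test}}}f(\widetilde{x})$, is classically approached via importance weighting with the density ratio $w(x) = p_{\mathrm{test}}(x)/p_{\mathrm{train}}(x)$. The sketch below illustrates that baseline only; it is not the paper's algorithm, and it assumes the density ratio is known in closed form, an assumption the paper's setting (unlabeled test samples, unknown densities) does not grant.

```python
import numpy as np

def importance_weighted_mean(x_train, f_train, density_ratio):
    """Estimate E_{x ~ p_test} f(x) from labeled training samples
    (x_i, f(x_i)) by reweighting each label with
    w(x_i) = p_test(x_i) / p_train(x_i).

    `density_ratio` is assumed known here, purely for illustration;
    covariate-shift algorithms must work without this oracle."""
    w = np.array([density_ratio(x) for x in x_train])
    return float(np.mean(w * np.asarray(f_train)))

# Toy check with known densities: p_train = Uniform[0,1],
# p_test = Beta(2,1) (density 2x on [0,1]), and f(x) = x,
# so E_test f = 2/3 while the unweighted training mean is 1/2.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200_000)
est = importance_weighted_mean(x, x, lambda t: 2.0 * t)
```

Under covariate shift the conditional $f(x)$ is shared between distributions, so reweighting the training labels is unbiased for the test mean; the difficulty the paper addresses is doing this efficiently, with guarantees, when the ratio must be inferred from samples.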

Cite this Paper


BibTeX
@InProceedings{pmlr-v313-adil26a,
  title     = {Efficient and Provable Algorithms for Covariate Shift},
  author    = {Adil, Deeksha and Blasiok, Jaroslaw},
  booktitle = {Proceedings of The 37th International Conference on Algorithmic Learning Theory},
  pages     = {1--34},
  year      = {2026},
  editor    = {Telgarsky, Matus and Ullman, Jonathan},
  volume    = {313},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--26 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v313/main/assets/adil26a/adil26a.pdf},
  url       = {https://proceedings.mlr.press/v313/adil26a.html},
  abstract  = {Covariate shift, a widely used assumption in tackling _distributional shift_ (when training and test distributions differ), focuses on scenarios where the distribution of the labels conditioned on the feature vector is the same, but the distribution of features in the training and test data are different. Despite the significance and extensive work on covariate shift, theoretical guarantees for algorithms in this domain remain sparse. In this paper, we distill the essence of the covariate shift problem and focus on estimating the average $E_{\widetilde{x}\sim p_{\mathrm{test}}}f(\widetilde{x})$, of any unknown and bounded function $f$, given labeled training samples $(x_i, f(x_i))$, and unlabeled test samples $\widetilde{x}_i$; this is a core subroutine for several widely studied learning problems. We give several efficient algorithms, with provable sample complexity and computational guarantees.}
}
Endnote
%0 Conference Paper
%T Efficient and Provable Algorithms for Covariate Shift
%A Deeksha Adil
%A Jaroslaw Blasiok
%B Proceedings of The 37th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2026
%E Matus Telgarsky
%E Jonathan Ullman
%F pmlr-v313-adil26a
%I PMLR
%P 1--34
%U https://proceedings.mlr.press/v313/adil26a.html
%V 313
%X Covariate shift, a widely used assumption in tackling _distributional shift_ (when training and test distributions differ), focuses on scenarios where the distribution of the labels conditioned on the feature vector is the same, but the distribution of features in the training and test data are different. Despite the significance and extensive work on covariate shift, theoretical guarantees for algorithms in this domain remain sparse. In this paper, we distill the essence of the covariate shift problem and focus on estimating the average $E_{\widetilde{x}\sim p_{\mathrm{test}}}f(\widetilde{x})$, of any unknown and bounded function $f$, given labeled training samples $(x_i, f(x_i))$, and unlabeled test samples $\widetilde{x}_i$; this is a core subroutine for several widely studied learning problems. We give several efficient algorithms, with provable sample complexity and computational guarantees.
APA
Adil, D. & Blasiok, J. (2026). Efficient and Provable Algorithms for Covariate Shift. Proceedings of The 37th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 313:1-34. Available from https://proceedings.mlr.press/v313/adil26a.html.
