Near-Optimal Linear Regression under Distribution Shift

Qi Lei, Wei Hu, Jason Lee
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:6164-6174, 2021.

Abstract

Transfer learning is essential when abundant data is available from the source domain but labeled data from the target domain is scarce. We develop estimators that achieve minimax linear risk for linear regression problems under distribution shift. Our algorithms cover different transfer learning settings, including covariate shift and model shift. We also consider the case where data are generated from either linear or general nonlinear models. We show that linear minimax estimators are within an absolute constant of the minimax risk, even among nonlinear estimators, for various source/target distributions.
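
For a concrete picture of the covariate-shift setting the abstract refers to, the sketch below is an illustrative assumption, not the paper's estimator or its minimax construction. All names (Sigma_S, Sigma_T, target_excess_risk) are hypothetical; it only sets up a shared regression vector, source/target covariate covariances that differ, and risk evaluated under the target distribution, with ordinary least squares as one example of a linear-in-y estimator.

```python
# Illustrative sketch only -- NOT the estimator from Lei, Hu & Lee (2021).
# Covariate shift: source and target share the same linear model beta, but
# their covariate covariances differ, and prediction risk is measured under
# the target covariate distribution.
import numpy as np

rng = np.random.default_rng(0)
d, n_source = 10, 200

beta = rng.normal(size=d)                    # shared regression vector
Sigma_S = np.diag(np.linspace(1.0, 2.0, d))  # source covariate covariance
Sigma_T = np.diag(np.linspace(2.0, 0.1, d))  # target covariate covariance (shifted)

# Labeled source data: y = x^T beta + noise, with x drawn from the source distribution
X_S = rng.normal(size=(n_source, d)) @ np.linalg.cholesky(Sigma_S).T
y_S = X_S @ beta + 0.5 * rng.normal(size=n_source)

# One linear-in-y estimator (beta_hat = A y with A = (X^T X)^{-1} X^T): ordinary least squares
beta_ols = np.linalg.lstsq(X_S, y_S, rcond=None)[0]

def target_excess_risk(b):
    """Excess risk under the target: E_{x ~ T} (x^T b - x^T beta)^2 = (b - beta)^T Sigma_T (b - beta)."""
    err = b - beta
    return float(err @ Sigma_T @ err)

print("OLS target excess risk:", target_excess_risk(beta_ols))
```

The paper's minimax linear estimators choose the matrix A mapping y to beta_hat to minimize the worst-case version of this target risk; the code above only fixes the evaluation criterion, not the optimal choice of A.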

Cite this Paper

BibTeX
@InProceedings{pmlr-v139-lei21a,
  title     = {Near-Optimal Linear Regression under Distribution Shift},
  author    = {Lei, Qi and Hu, Wei and Lee, Jason},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {6164--6174},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/lei21a/lei21a.pdf},
  url       = {https://proceedings.mlr.press/v139/lei21a.html},
  abstract  = {Transfer learning is essential when sufficient data comes from the source domain, with scarce labeled data from the target domain. We develop estimators that achieve minimax linear risk for linear regression problems under distribution shift. Our algorithms cover different transfer learning settings including covariate shift and model shift. We also consider when data are generated from either linear or general nonlinear models. We show that linear minimax estimators are within an absolute constant of the minimax risk even among nonlinear estimators for various source/target distributions.}
}
Endnote
%0 Conference Paper
%T Near-Optimal Linear Regression under Distribution Shift
%A Qi Lei
%A Wei Hu
%A Jason Lee
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-lei21a
%I PMLR
%P 6164--6174
%U https://proceedings.mlr.press/v139/lei21a.html
%V 139
%X Transfer learning is essential when sufficient data comes from the source domain, with scarce labeled data from the target domain. We develop estimators that achieve minimax linear risk for linear regression problems under distribution shift. Our algorithms cover different transfer learning settings including covariate shift and model shift. We also consider when data are generated from either linear or general nonlinear models. We show that linear minimax estimators are within an absolute constant of the minimax risk even among nonlinear estimators for various source/target distributions.
APA
Lei, Q., Hu, W. & Lee, J. (2021). Near-Optimal Linear Regression under Distribution Shift. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:6164-6174. Available from https://proceedings.mlr.press/v139/lei21a.html.