Near-Optimal Linear Regression under Distribution Shift
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:6164-6174, 2021.
Abstract
Transfer learning is essential when sufficient data is available from the source domain but labeled data from the target domain is scarce. We develop estimators that achieve minimax linear risk for linear regression problems under distribution shift. Our algorithms cover several transfer learning settings, including covariate shift and model shift. We also consider settings where the data are generated from either linear or general nonlinear models. We show that linear minimax estimators are within an absolute constant of the minimax risk, even among nonlinear estimators, for various source/target distributions.
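As a toy illustration of the covariate-shift setting the abstract refers to (this is a hypothetical sketch, not the paper's estimator): the source and target domains share the same linear conditional mean y = x·β + noise, but draw their covariates x from different distributions. The distribution names, sample sizes, and shift parameters below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared linear model: y = x @ beta + noise (same in source and target).
d = 5
beta = rng.normal(size=d)

# Covariate shift: source x ~ N(0, I), target x ~ N(1, 0.49*I).
# Only the marginal distribution of x changes; p(y | x) is unchanged.
n_src = 500
X_src = rng.normal(size=(n_src, d))
y_src = X_src @ beta + 0.1 * rng.normal(size=n_src)
X_tgt = rng.normal(loc=1.0, scale=0.7, size=(200, d))

# Ordinary least squares fit on the source sample.
beta_hat, *_ = np.linalg.lstsq(X_src, y_src, rcond=None)

# Because the conditional mean is shared, the source-fit coefficients
# still give accurate predictions on target covariates.
err = np.mean((X_tgt @ beta_hat - X_tgt @ beta) ** 2)
print(f"mean squared prediction error on target: {err:.2e}")
```

Under pure covariate shift with a well-specified linear model, plain least squares on the source already transfers; the harder regimes the paper addresses (model shift, nonlinear data-generating models) are where minimax-optimal estimators depart from this baseline.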