Domain Adaptation under Target and Conditional Shift
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):819-827, 2013.
Abstract
Let X denote the features and Y the target. We consider domain adaptation under three possible scenarios: (1) the marginal P_Y changes while the conditional P_{X|Y} stays the same (target shift); (2) the marginal P_Y is fixed while the conditional P_{X|Y} changes under certain constraints (conditional shift); and (3) the marginal P_Y changes and the conditional P_{X|Y} changes under constraints (generalized target shift). Using background knowledge, causal interpretations allow us to determine which of these scenarios applies to the problem at hand. We exploit importance reweighting or sample transformation to find the learning machine that works well on test data, and propose to estimate the weights or transformations by reweighting or transforming training data to reproduce the covariate distribution on the test domain. Thanks to kernel embeddings of conditional as well as marginal distributions, the proposed approaches avoid explicit distribution estimation and are applicable to high-dimensional problems. Numerical evaluations on synthetic and real-world datasets demonstrate the effectiveness of the proposed framework.
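As a concrete illustration of the reweighting route in the target-shift scenario, the hypothetical Python sketch below matches the kernel mean embedding of the reweighted training covariates to that of the test covariates, then converts the estimated test class proportions into per-sample importance weights. The function name estimate_target_shift_weights, the Gaussian kernel choice, and the bandwidth sigma are assumptions made for the sketch, not the authors' released implementation.

```python
# Hypothetical sketch: class-prior reweighting under target shift via
# kernel mean embedding matching.  Assumes P_{X|Y} is shared across domains.
import numpy as np
from scipy.optimize import minimize


def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix between rows of A and rows of B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))


def estimate_target_shift_weights(X_tr, y_tr, X_te, sigma=1.0):
    """Estimate test-domain class proportions alpha and importance weights
    w_i = alpha[y_i] / p_tr(y_i) by minimizing the RKHS distance
    || sum_c alpha_c mu_c - mu_te ||^2 over the probability simplex,
    where mu_c is the kernel mean of class c's training covariates and
    mu_te is the kernel mean of the test covariates."""
    classes = np.unique(y_tr)
    C, m = len(classes), len(X_te)

    K_trtr = rbf_kernel(X_tr, X_tr, sigma)
    K_trte = rbf_kernel(X_tr, X_te, sigma)
    masks = [y_tr == c for c in classes]

    # A[c, d] = <mu_c, mu_d>,  b[c] = <mu_c, mu_te>
    A = np.zeros((C, C))
    b = np.zeros(C)
    for c in range(C):
        n_c = masks[c].sum()
        b[c] = K_trte[masks[c]].sum() / (n_c * m)
        for d in range(C):
            n_d = masks[d].sum()
            A[c, d] = K_trtr[np.ix_(masks[c], masks[d])].sum() / (n_c * n_d)

    # Quadratic objective, constrained to the simplex (alpha >= 0, sum = 1).
    objective = lambda a: a @ A @ a - 2.0 * b @ a
    res = minimize(objective, np.full(C, 1.0 / C),
                   bounds=[(0.0, None)] * C,
                   constraints=[{"type": "eq", "fun": lambda a: a.sum() - 1.0}],
                   method="SLSQP")
    alpha = res.x

    p_tr = np.array([mask.mean() for mask in masks])  # training class priors
    idx = np.searchsorted(classes, y_tr)              # map labels to 0..C-1
    weights = alpha[idx] / p_tr[idx]
    return alpha, weights
```

The resulting weights can be passed to any learner that supports per-sample weights (e.g., the sample_weight argument of scikit-learn estimators). The conditional-shift and generalized-target-shift scenarios additionally transform the training features rather than only reweighting them, which this minimal sketch does not cover.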