A Generalized Neyman-Pearson Criterion for Optimal Domain Adaptation
Proceedings of the 30th International Conference on Algorithmic Learning Theory, PMLR 98:738-761, 2019.
Abstract
In the problem of domain adaptation for binary classification, the learner is presented with labeled examples from a source domain and must correctly classify unlabeled examples from a target domain, which may differ from the source. Previous work on this problem has assumed that the performance measure of interest is the expected value of some loss function. We study a Neyman-Pearson-like criterion and argue that, for this optimality criterion, stronger domain adaptation results are possible than what has previously been established. In particular, we study a class of domain adaptation problems that generalizes both the covariate shift assumption and a model for feature-dependent label noise, and establish optimal classification on the target domain despite not having access to labeled data from this domain.
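For context, the classical Neyman-Pearson classification criterion (a standard formulation, not necessarily the exact notation used in the paper) constrains one type of error while minimizing the other, rather than minimizing a single expected loss:

```latex
% Neyman-Pearson classification (standard formulation; notation is
% illustrative, not taken from the paper). Given class-conditional
% distributions P_0 = P_{X \mid Y=0} and P_1 = P_{X \mid Y=1} and a
% tolerance \alpha \in (0,1), choose a classifier h : \mathcal{X} \to \{0,1\}
% solving
\begin{equation*}
  \min_{h} \; P_1\bigl(h(X) = 0\bigr)
  \quad \text{subject to} \quad
  P_0\bigl(h(X) = 1\bigr) \le \alpha ,
\end{equation*}
% i.e., minimize the miss (false-negative) rate subject to a hard bound
% on the false-alarm (false-positive) rate, in contrast to minimizing a
% single expected loss over both error types.
```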