A Generalized Neyman-Pearson Criterion for Optimal Domain Adaptation

Clayton Scott
Proceedings of the 30th International Conference on Algorithmic Learning Theory, PMLR 98:738-761, 2019.

Abstract

In the problem of domain adaptation for binary classification, the learner is presented with labeled examples from a source domain, and must correctly classify unlabeled examples from a target domain, which may differ from the source. Previous work on this problem has assumed that the performance measure of interest is the expected value of some loss function. We study a Neyman-Pearson-like criterion and argue that, for this optimality criterion, stronger domain adaptation results are possible than what has previously been established. In particular, we study a class of domain adaptation problems that generalizes both the covariate shift assumption and a model for feature-dependent label noise, and establish optimal classification on the target domain despite not having access to labelled data from this domain.
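For readers unfamiliar with the terminology, the classical Neyman-Pearson criterion that the paper generalizes can be sketched as follows (standard background, not a result from this paper; the notation $P_0, P_1, \alpha$ is illustrative, not the paper's):

```latex
% Classical Neyman-Pearson criterion (background sketch).
% P_0 and P_1 denote the class-conditional distributions (class 0 is the
% "null" class), f is a classifier, and \alpha is a user-chosen bound on
% the false-alarm (type I error) rate.
\min_{f}\; R_1(f)
\quad \text{subject to} \quad R_0(f) \le \alpha,
\qquad \text{where } R_y(f) = P_y\bigl(f(X) \ne y\bigr).
```

That is, one minimizes the miss rate subject to a hard constraint on the false-alarm rate, rather than minimizing a single expected loss as in the prior work the abstract contrasts with.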

Cite this Paper


BibTeX
@InProceedings{pmlr-v98-scott19a,
  title     = {A Generalized Neyman-Pearson Criterion for Optimal Domain Adaptation},
  author    = {Scott, Clayton},
  booktitle = {Proceedings of the 30th International Conference on Algorithmic Learning Theory},
  pages     = {738--761},
  year      = {2019},
  editor    = {Aurélien Garivier and Satyen Kale},
  volume    = {98},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chicago, Illinois},
  month     = {22--24 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v98/scott19a/scott19a.pdf},
  url       = {http://proceedings.mlr.press/v98/scott19a.html},
  abstract  = {In the problem of domain adaptation for binary classification, the learner is presented with labeled examples from a source domain, and must correctly classify unlabeled examples from a target domain, which may differ from the source. Previous work on this problem has assumed that the performance measure of interest is the expected value of some loss function. We study a Neyman-Pearson-like criterion and argue that, for this optimality criterion, stronger domain adaptation results are possible than what has previously been established. In particular, we study a class of domain adaptation problems that generalizes both the covariate shift assumption and a model for feature-dependent label noise, and establish optimal classification on the target domain despite not having access to labelled data from this domain.}
}
Endnote
%0 Conference Paper
%T A Generalized Neyman-Pearson Criterion for Optimal Domain Adaptation
%A Clayton Scott
%B Proceedings of the 30th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Aurélien Garivier
%E Satyen Kale
%F pmlr-v98-scott19a
%I PMLR
%J Proceedings of Machine Learning Research
%P 738--761
%U http://proceedings.mlr.press
%V 98
%W PMLR
%X In the problem of domain adaptation for binary classification, the learner is presented with labeled examples from a source domain, and must correctly classify unlabeled examples from a target domain, which may differ from the source. Previous work on this problem has assumed that the performance measure of interest is the expected value of some loss function. We study a Neyman-Pearson-like criterion and argue that, for this optimality criterion, stronger domain adaptation results are possible than what has previously been established. In particular, we study a class of domain adaptation problems that generalizes both the covariate shift assumption and a model for feature-dependent label noise, and establish optimal classification on the target domain despite not having access to labelled data from this domain.
APA
Scott, C. (2019). A Generalized Neyman-Pearson Criterion for Optimal Domain Adaptation. Proceedings of the 30th International Conference on Algorithmic Learning Theory, in PMLR 98:738-761.
