Domain Adaptation with Coupled Subspaces

John Blitzer, Sham Kakade, Dean Foster
Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, PMLR 15:173-181, 2011.

Abstract

Domain adaptation algorithms address a key issue in applied machine learning: How can we train a system under a source distribution but achieve high performance under a different target distribution? We tackle this question for divergent distributions where crucial predictive target features may not even have support under the source distribution. In this setting, the key intuition is that if we can link target-specific features to source features, we can learn effectively using only source labeled data. We formalize this intuition, as well as the assumptions under which such coupled learning is possible. This allows us to give finite sample target error bounds (using only source training data) and an algorithm which performs at the state-of-the-art on two natural language processing adaptation tasks which are characterized by novel target features.

Cite this Paper


BibTeX
@InProceedings{pmlr-v15-blitzer11a,
  title     = {Domain Adaptation with Coupled Subspaces},
  author    = {Blitzer, John and Kakade, Sham and Foster, Dean},
  booktitle = {Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {173--181},
  year      = {2011},
  editor    = {Gordon, Geoffrey and Dunson, David and Dudík, Miroslav},
  volume    = {15},
  series    = {Proceedings of Machine Learning Research},
  address   = {Fort Lauderdale, FL, USA},
  month     = {11--13 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v15/blitzer11a/blitzer11a.pdf},
  url       = {https://proceedings.mlr.press/v15/blitzer11a.html},
  abstract  = {Domain adaptation algorithms address a key issue in applied machine learning: How can we train a system under a source distribution but achieve high performance under a different target distribution? We tackle this question for divergent distributions where crucial predictive target features may not even have support under the source distribution. In this setting, the key intuition is that if we can link target-specific features to source features, we can learn effectively using only source labeled data. We formalize this intuition, as well as the assumptions under which such coupled learning is possible. This allows us to give finite sample target error bounds (using only source training data) and an algorithm which performs at the state-of-the-art on two natural language processing adaptation tasks which are characterized by novel target features.}
}
Endnote
%0 Conference Paper
%T Domain Adaptation with Coupled Subspaces
%A John Blitzer
%A Sham Kakade
%A Dean Foster
%B Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2011
%E Geoffrey Gordon
%E David Dunson
%E Miroslav Dudík
%F pmlr-v15-blitzer11a
%I PMLR
%P 173--181
%U https://proceedings.mlr.press/v15/blitzer11a.html
%V 15
%X Domain adaptation algorithms address a key issue in applied machine learning: How can we train a system under a source distribution but achieve high performance under a different target distribution? We tackle this question for divergent distributions where crucial predictive target features may not even have support under the source distribution. In this setting, the key intuition is that if we can link target-specific features to source features, we can learn effectively using only source labeled data. We formalize this intuition, as well as the assumptions under which such coupled learning is possible. This allows us to give finite sample target error bounds (using only source training data) and an algorithm which performs at the state-of-the-art on two natural language processing adaptation tasks which are characterized by novel target features.
RIS
TY  - CPAPER
TI  - Domain Adaptation with Coupled Subspaces
AU  - John Blitzer
AU  - Sham Kakade
AU  - Dean Foster
BT  - Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics
DA  - 2011/06/14
ED  - Geoffrey Gordon
ED  - David Dunson
ED  - Miroslav Dudík
ID  - pmlr-v15-blitzer11a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 15
SP  - 173
EP  - 181
L1  - http://proceedings.mlr.press/v15/blitzer11a/blitzer11a.pdf
UR  - https://proceedings.mlr.press/v15/blitzer11a.html
AB  - Domain adaptation algorithms address a key issue in applied machine learning: How can we train a system under a source distribution but achieve high performance under a different target distribution? We tackle this question for divergent distributions where crucial predictive target features may not even have support under the source distribution. In this setting, the key intuition is that if we can link target-specific features to source features, we can learn effectively using only source labeled data. We formalize this intuition, as well as the assumptions under which such coupled learning is possible. This allows us to give finite sample target error bounds (using only source training data) and an algorithm which performs at the state-of-the-art on two natural language processing adaptation tasks which are characterized by novel target features.
ER  -
APA
Blitzer, J., Kakade, S. & Foster, D. (2011). Domain Adaptation with Coupled Subspaces. Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 15:173-181. Available from https://proceedings.mlr.press/v15/blitzer11a.html.