Theoretical Performance Guarantees for Partial Domain Adaptation via Partial Optimal Transport

Jayadev Naram, Fredrik Hellström, Ziming Wang, Rebecka Jörnsten, Giuseppe Durisi
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:45663-45681, 2025.

Abstract

In many scenarios of practical interest, labeled data from a target distribution are scarce while labeled data from a related source distribution are abundant. One particular setting of interest arises when the target label space is a subset of the source label space, leading to the framework of partial domain adaptation (PDA). Typical approaches to PDA involve minimizing a domain alignment term and a weighted empirical loss on the source data, with the aim of transferring knowledge between domains. However, a theoretical basis for this procedure is lacking, and in particular, most existing weighting schemes are heuristic. In this work, we derive generalization bounds for the PDA problem based on partial optimal transport. These bounds corroborate the use of the partial Wasserstein distance as a domain alignment term, and lead to theoretically motivated explicit expressions for the empirical source loss weights. Inspired by these bounds, we devise a practical algorithm for PDA, termed WARMPOT. Through extensive numerical experiments, we show that WARMPOT is competitive with recent approaches, and that our proposed weights improve on existing schemes.
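The abstract points to the partial Wasserstein distance as the domain alignment term. As a purely illustrative sketch (not the paper's WARMPOT algorithm or its theoretically motivated weights), the snippet below shows how such a quantity between source and target feature samples can be computed with the POT library; the feature dimension, sample sizes, and the transported mass fraction m are assumptions made for the example.

    # Minimal sketch: partial Wasserstein distance between source and target
    # features using POT. Sizes, dimension, and mass fraction are illustrative.
    import numpy as np
    import ot  # Python Optimal Transport (POT)

    rng = np.random.default_rng(0)
    Xs = rng.normal(size=(100, 16))   # source features (assumed)
    Xt = rng.normal(size=(40, 16))    # target features (assumed)

    a = np.full(100, 1.0 / 100)       # uniform source weights
    b = np.full(40, 1.0 / 40)         # uniform target weights
    M = ot.dist(Xs, Xt)               # pairwise squared-Euclidean cost matrix

    m = 0.4                           # fraction of mass to transport (assumed)
    cost = ot.partial.partial_wasserstein2(a, b, M, m=m)  # partial OT cost
    plan = ot.partial.partial_wasserstein(a, b, M, m=m)   # partial OT plan
    print(cost, plan.sum())           # plan.sum() is approximately m

Transporting only a fraction m of the mass is what allows the alignment to ignore source samples from classes absent in the target, which is the reason partial (rather than standard) optimal transport is the natural tool in the PDA setting described above.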

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-naram25a,
  title     = {Theoretical Performance Guarantees for Partial Domain Adaptation via Partial Optimal Transport},
  author    = {Naram, Jayadev and Hellstr\"{o}m, Fredrik and Wang, Ziming and J\"{o}rnsten, Rebecka and Durisi, Giuseppe},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {45663--45681},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/naram25a/naram25a.pdf},
  url       = {https://proceedings.mlr.press/v267/naram25a.html},
  abstract  = {In many scenarios of practical interest, labeled data from a target distribution are scarce while labeled data from a related source distribution are abundant. One particular setting of interest arises when the target label space is a subset of the source label space, leading to the framework of partial domain adaptation (PDA). Typical approaches to PDA involve minimizing a domain alignment term and a weighted empirical loss on the source data, with the aim of transferring knowledge between domains. However, a theoretical basis for this procedure is lacking, and in particular, most existing weighting schemes are heuristic. In this work, we derive generalization bounds for the PDA problem based on partial optimal transport. These bounds corroborate the use of the partial Wasserstein distance as a domain alignment term, and lead to theoretically motivated explicit expressions for the empirical source loss weights. Inspired by these bounds, we devise a practical algorithm for PDA, termed WARMPOT. Through extensive numerical experiments, we show that WARMPOT is competitive with recent approaches, and that our proposed weights improve on existing schemes.}
}
Endnote
%0 Conference Paper
%T Theoretical Performance Guarantees for Partial Domain Adaptation via Partial Optimal Transport
%A Jayadev Naram
%A Fredrik Hellström
%A Ziming Wang
%A Rebecka Jörnsten
%A Giuseppe Durisi
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-naram25a
%I PMLR
%P 45663--45681
%U https://proceedings.mlr.press/v267/naram25a.html
%V 267
%X In many scenarios of practical interest, labeled data from a target distribution are scarce while labeled data from a related source distribution are abundant. One particular setting of interest arises when the target label space is a subset of the source label space, leading to the framework of partial domain adaptation (PDA). Typical approaches to PDA involve minimizing a domain alignment term and a weighted empirical loss on the source data, with the aim of transferring knowledge between domains. However, a theoretical basis for this procedure is lacking, and in particular, most existing weighting schemes are heuristic. In this work, we derive generalization bounds for the PDA problem based on partial optimal transport. These bounds corroborate the use of the partial Wasserstein distance as a domain alignment term, and lead to theoretically motivated explicit expressions for the empirical source loss weights. Inspired by these bounds, we devise a practical algorithm for PDA, termed WARMPOT. Through extensive numerical experiments, we show that WARMPOT is competitive with recent approaches, and that our proposed weights improve on existing schemes.
APA
Naram, J., Hellström, F., Wang, Z., Jörnsten, R. & Durisi, G. (2025). Theoretical Performance Guarantees for Partial Domain Adaptation via Partial Optimal Transport. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:45663-45681. Available from https://proceedings.mlr.press/v267/naram25a.html.