Optimal Transport meets Noisy Label Robust Loss and MixUp Regularization for Domain Adaptation

Kilian Fatras, Hiroki Naganuma, Ioannis Mitliagkas
Proceedings of The 1st Conference on Lifelong Learning Agents, PMLR 199:966-981, 2022.

Abstract

It is common in computer vision to be confronted with domain shift: images that share the same class but differ in acquisition conditions. In domain adaptation (DA), one wants to classify unlabeled target images using labeled source images. Unfortunately, deep neural networks trained on a source training set perform poorly on target images that do not belong to the training domain. One strategy to improve this performance is to align the source and target image distributions in an embedded space using optimal transport (OT). To compute OT, most methods use the minibatch optimal transport approximation, which causes negative transfer, i.e., aligning samples with different labels, and leads to overfitting. In this work, we explain negative alignment as a noisy label assignment to target images and mitigate its effect through appropriate regularization. We propose to couple MixUp regularization with a loss that is robust to noisy labels in order to improve domain adaptation performance. We show in an extensive ablation study that the combination of the two techniques is critical to achieving improved performance. Finally, we evaluate our method, called mixunbot, on several benchmarks and real-world DA problems.
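The three ingredients named in the abstract — a minibatch OT alignment term, MixUp regularization, and a noise-robust classification loss — can be sketched in a few lines of numpy. This is an illustrative toy, not the paper's implementation: the entropic Sinkhorn solver, the generalized cross-entropy loss (one standard example of a noisy-label-robust loss), and all hyperparameter values below are assumptions chosen for the sketch, and may differ from the authors' exact choices.

```python
import numpy as np

def sinkhorn_plan(C, reg=0.1, n_iter=200):
    """Entropic-regularized OT plan between two uniform minibatch
    distributions, via Sinkhorn iterations on cost matrix C."""
    n, m = C.shape
    a = np.full(n, 1.0 / n)   # uniform weights on source samples
    b = np.full(m, 1.0 / m)   # uniform weights on target samples
    K = np.exp(-C / reg)      # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]  # transport plan

def mixup(x1, x2, y1, y2, alpha=0.2, seed=None):
    """MixUp: a convex combination of two samples and their one-hot labels,
    with the mixing weight drawn from a Beta(alpha, alpha) distribution."""
    lam = np.random.default_rng(seed).beta(alpha, alpha)
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

def gce_loss(probs, y_onehot, q=0.7):
    """Generalized cross-entropy: interpolates between cross-entropy
    (q -> 0) and MAE (q = 1), and is more robust to label noise than
    plain cross-entropy. Used here as one example of a robust loss."""
    p_true = np.sum(probs * y_onehot, axis=1)
    return np.mean((1.0 - p_true ** q) / q)

# Toy minibatches of embeddings (hypothetical data for illustration).
rng = np.random.default_rng(0)
xs = rng.normal(size=(4, 3))          # source minibatch
xt = rng.normal(size=(5, 3)) + 1.0    # shifted target minibatch
C = ((xs[:, None, :] - xt[None, :, :]) ** 2).sum(-1)  # squared-Euclidean cost
C = C / C.max()                        # normalize to keep the kernel stable
P = sinkhorn_plan(C)
align_loss = np.sum(P * C)             # minibatch OT alignment term
x_mix, y_mix = mixup(xs[0], xs[1], np.array([1.0, 0.0]),
                     np.array([0.0, 1.0]), seed=1)
```

In a full pipeline the alignment term would be added to the (robust) classification loss on MixUp-augmented samples; the point of the sketch is only the shape of each component.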

Cite this Paper


BibTeX
@InProceedings{pmlr-v199-fatras22a,
  title     = {Optimal Transport meets Noisy Label Robust Loss and MixUp Regularization for Domain Adaptation},
  author    = {Fatras, Kilian and Naganuma, Hiroki and Mitliagkas, Ioannis},
  booktitle = {Proceedings of The 1st Conference on Lifelong Learning Agents},
  pages     = {966--981},
  year      = {2022},
  editor    = {Chandar, Sarath and Pascanu, Razvan and Precup, Doina},
  volume    = {199},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--24 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v199/fatras22a/fatras22a.pdf},
  url       = {https://proceedings.mlr.press/v199/fatras22a.html},
  abstract  = {It is common in computer vision to be confronted with domain shift: images which have the same class but different acquisition conditions. In domain adaptation (DA), one wants to classify unlabeled target images using source labeled images. Unfortunately, deep neural networks trained on a source training set perform poorly on target images which do not belong to the training domain. One strategy to improve these performances is to align the source and target image distributions in an embedded space using optimal transport (OT). To compute OT, most methods use the minibatch optimal transport approximation which causes negative transfer, i.e. aligning samples with different labels, and leads to overfitting. In this work, we mitigate negative alignment by explaining it as a noisy label assignment to target images. We then mitigate its effect by appropriate regularization. We propose to couple the MixUp regularization with a loss that is robust to noisy labels in order to improve domain adaptation performance. We show in an extensive ablation study that a combination of the two techniques is critical to achieve improved performance. Finally, we evaluate our method, called mixunbot, on several benchmarks and real-world DA problems.}
}
Endnote
%0 Conference Paper
%T Optimal Transport meets Noisy Label Robust Loss and MixUp Regularization for Domain Adaptation
%A Kilian Fatras
%A Hiroki Naganuma
%A Ioannis Mitliagkas
%B Proceedings of The 1st Conference on Lifelong Learning Agents
%C Proceedings of Machine Learning Research
%D 2022
%E Sarath Chandar
%E Razvan Pascanu
%E Doina Precup
%F pmlr-v199-fatras22a
%I PMLR
%P 966--981
%U https://proceedings.mlr.press/v199/fatras22a.html
%V 199
%X It is common in computer vision to be confronted with domain shift: images which have the same class but different acquisition conditions. In domain adaptation (DA), one wants to classify unlabeled target images using source labeled images. Unfortunately, deep neural networks trained on a source training set perform poorly on target images which do not belong to the training domain. One strategy to improve these performances is to align the source and target image distributions in an embedded space using optimal transport (OT). To compute OT, most methods use the minibatch optimal transport approximation which causes negative transfer, i.e. aligning samples with different labels, and leads to overfitting. In this work, we mitigate negative alignment by explaining it as a noisy label assignment to target images. We then mitigate its effect by appropriate regularization. We propose to couple the MixUp regularization with a loss that is robust to noisy labels in order to improve domain adaptation performance. We show in an extensive ablation study that a combination of the two techniques is critical to achieve improved performance. Finally, we evaluate our method, called mixunbot, on several benchmarks and real-world DA problems.
APA
Fatras, K., Naganuma, H., & Mitliagkas, I. (2022). Optimal Transport meets Noisy Label Robust Loss and MixUp Regularization for Domain Adaptation. Proceedings of The 1st Conference on Lifelong Learning Agents, in Proceedings of Machine Learning Research, 199:966-981. Available from https://proceedings.mlr.press/v199/fatras22a.html.
