A Simple yet Effective Adaptive Inter-organ Contrastive Learning Framework for Unsupervised Domain Adaptation

Yiyou Sun, Zheyao Gao, Xiaogen Zhou, Qi Dou, Winnie Chiu Wing Chu
Proceedings of The 9th International Conference on Medical Imaging with Deep Learning, PMLR 315:1523-1538, 2026.

Abstract

Strong unsupervised domain adaptation (UDA) in multi-organ segmentation seeks to unify complementary information from heterogeneous imaging protocols within a single model without sacrificing source-modality performance, yet the substantial domain gap between modalities makes feature-level alignment non-trivial. Pseudo-label learning (PLL) has emerged as the dominant paradigm, but it suffers from information loss due to hard thresholding and bias introduced by class imbalance and noisy predictions. Contrastive learning (CL) offers a complementary direction by structuring semantic contrast, yet existing voxel-level formulations incur prohibitive computational costs on volumetric data and fail to capture the global anatomical context critical for organ segmentation. In this work, we propose Adaptive Inter-organ Contrastive Learning (AICL), a unified UDA framework for 3D multi-organ cross-modality segmentation that exploits PLL and CL synergistically to facilitate better cross-modality feature alignment. AICL employs dynamic soft pseudo-labels as guidance in the feature latent space to organize inter-organ samples as positive-negative pairs for CL. Meanwhile, the model is trained with supervised consistency learning (SCL) using mixed ground truths and pseudo-labels, promoting a more discriminative and compact shared latent space. Extensive experiments and ablation studies on an orbital and a cardiac dataset reveal the effectiveness of each component and a significant advancement in segmentation results.
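The idea of using soft pseudo-labels to arrange inter-organ samples as positive-negative pairs can be illustrated with a minimal sketch. This is not the paper's actual loss: the prototype construction, the function name `soft_contrastive_loss`, and the confidence weighting are assumptions made for illustration. It shows an organ-level InfoNCE-style loss where same-organ prototypes across domains are positives, other organs serve as negatives, and target-side pseudo-label confidences down-weight noisy pairs.

```python
import numpy as np

def soft_contrastive_loss(src_protos, tgt_protos, tgt_conf, tau=0.1):
    """Illustrative organ-level contrastive loss (not the authors' code).

    src_protos: (K, D) L2-normalized per-organ prototype features, source domain.
    tgt_protos: (K, D) L2-normalized per-organ prototype features, target domain.
    tgt_conf:   (K,)   soft pseudo-label confidence per target prototype.
    """
    sim = src_protos @ tgt_protos.T / tau           # (K, K) cross-domain similarities
    sim = sim - sim.max(axis=1, keepdims=True)      # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # Positives lie on the diagonal (same organ across domains); the other
    # organs act as negatives through the softmax denominator.
    pos = np.diag(log_prob)
    # Confidence-weighted average: uncertain pseudo-labels contribute less.
    return float(-(tgt_conf * pos).sum() / tgt_conf.sum())
```

Working at the level of per-organ prototypes rather than individual voxels keeps the pair count at K^2 instead of growing with volume size, which is one plausible way to avoid the voxel-level cost the abstract mentions.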

Cite this Paper


BibTeX
@InProceedings{pmlr-v315-sun26a,
  title = {A Simple yet Effective Adaptive Inter-organ Contrastive Learning Framework for Unsupervised Domain Adaptation},
  author = {Sun, Yiyou and Gao, Zheyao and Zhou, Xiaogen and Dou, Qi and Chiu Wing Chu, Winnie},
  booktitle = {Proceedings of The 9th International Conference on Medical Imaging with Deep Learning},
  pages = {1523--1538},
  year = {2026},
  editor = {Huo, Yuankai and Gao, Mingchen and Kuo, Chang-Fu and Jin, Yueming and Deng, Ruining},
  volume = {315},
  series = {Proceedings of Machine Learning Research},
  month = {08--10 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v315/main/assets/sun26a/sun26a.pdf},
  url = {https://proceedings.mlr.press/v315/sun26a.html},
  abstract = {Strong unsupervised domain adaptation (UDA) in multi-organ segmentation seeks to unify complementary information from heterogeneous imaging protocols within a single model without sacrificing source-modality performance, yet the substantial domain gap between modalities makes feature-level alignment non-trivial. Pseudo-label learning (PLL) has emerged as the dominant paradigm, but it suffers from information loss due to hard thresholding and bias introduced by class imbalance and noisy predictions. Contrastive learning (CL) offers a complementary direction by structuring semantic contrast, yet existing voxel-level formulations incur prohibitive computational costs on volumetric data and fail to capture the global anatomical context critical for organ segmentation. In this work, we propose Adaptive Inter-organ Contrastive Learning (AICL), a unified UDA framework for 3D multi-organ cross-modality segmentation that exploits PLL and CL synergistically to facilitate better cross-modality feature alignment. AICL employs dynamic soft pseudo-labels as guidance in the feature latent space to organize inter-organ samples as positive-negative pairs for CL. Meanwhile, the model is trained with supervised consistency learning (SCL) using mixed ground truths and pseudo-labels, promoting a more discriminative and compact shared latent space. Extensive experiments and ablation studies on an orbital and a cardiac dataset reveal the effectiveness of each component and a significant advancement in segmentation results.}
}
Endnote
%0 Conference Paper %T A Simple yet Effective Adaptive Inter-organ Contrastive Learning Framework for Unsupervised Domain Adaptation %A Yiyou Sun %A Zheyao Gao %A Xiaogen Zhou %A Qi Dou %A Winnie Chiu Wing Chu %B Proceedings of The 9th International Conference on Medical Imaging with Deep Learning %C Proceedings of Machine Learning Research %D 2026 %E Yuankai Huo %E Mingchen Gao %E Chang-Fu Kuo %E Yueming Jin %E Ruining Deng %F pmlr-v315-sun26a %I PMLR %P 1523--1538 %U https://proceedings.mlr.press/v315/sun26a.html %V 315 %X Strong unsupervised domain adaptation (UDA) in multi-organ segmentation seeks to unify complementary information from heterogeneous imaging protocols within a single model without sacrificing source-modality performance, yet the substantial domain gap between modalities makes feature-level alignment non-trivial. Pseudo-label learning (PLL) has emerged as the dominant paradigm, but it suffers from information loss due to hard thresholding and bias introduced by class imbalance and noisy predictions. Contrastive learning (CL) offers a complementary direction by structuring semantic contrast, yet existing voxel-level formulations incur prohibitive computational costs on volumetric data and fail to capture the global anatomical context critical for organ segmentation. In this work, we propose Adaptive Inter-organ Contrastive Learning (AICL), a unified UDA framework for 3D multi-organ cross-modality segmentation that exploits PLL and CL synergistically to facilitate better cross-modality feature alignment. AICL employs dynamic soft pseudo-labels as guidance in the feature latent space to organize inter-organ samples as positive-negative pairs for CL. Meanwhile, the model is trained with supervised consistency learning (SCL) using mixed ground truths and pseudo-labels, promoting a more discriminative and compact shared latent space. Extensive experiments and ablation studies on an orbital and a cardiac dataset reveal the effectiveness of each component and a significant advancement in segmentation results.
APA
Sun, Y., Gao, Z., Zhou, X., Dou, Q. &amp; Chiu Wing Chu, W. (2026). A Simple yet Effective Adaptive Inter-organ Contrastive Learning Framework for Unsupervised Domain Adaptation. Proceedings of The 9th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 315:1523-1538. Available from https://proceedings.mlr.press/v315/sun26a.html.