Unsupervised Domain Adaptation for Medical Image Segmentation with Dynamic Prototype-based Contrastive Learning
Proceedings of the fifth Conference on Health, Inference, and Learning, PMLR 248:312-325, 2024.
Abstract
Medical image segmentation typically requires numerous dense annotations in the target domain to train models, which is time-consuming and labour-intensive. To alleviate this burden, unsupervised domain adaptation (UDA) has emerged to enhance model generalization in the target domain by harnessing labeled data from the source domain along with unlabeled data from the target domain. In this paper, we introduce a novel Dynamic Prototype Contrastive Learning (DPCL) framework for UDA in medical image segmentation, which dynamically updates cross-domain global prototypes and excavates implicit discrepancy information in a contrastive manner. DPCL learns cross-domain global feature representations while enhancing the discriminative capability of the segmentation model. Specifically, we design a novel cross-domain prototype evolution module that generates evolved cross-domain prototypes by employing dynamic updating and evolutionary strategies. This module facilitates a gradual transition from the source domain to the target domain while acquiring cross-domain global guidance knowledge. Moreover, we devise a cross-domain embedding contrastive module that establishes contrastive relationships in the embedding space, capturing both homogeneous information within each category and heterogeneous information across categories and thereby sharpening the model's discriminative capability. Experimental results demonstrate that the proposed DPCL is effective and outperforms state-of-the-art methods.
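The abstract does not give implementation details, but the two core ideas it names, dynamically updated global prototypes and a prototype-based contrastive objective, can be illustrated with a minimal sketch. Everything below is a hypothetical reconstruction: the function names, the EMA momentum, and the temperature are assumptions, not the paper's actual method.

```python
import numpy as np

def class_prototypes(feats, labels, num_classes):
    # Mean feature vector per class present in the batch (batch-level prototypes).
    protos = np.zeros((num_classes, feats.shape[1]))
    seen = np.zeros(num_classes, dtype=bool)
    for c in range(num_classes):
        sel = labels == c
        if sel.any():
            protos[c] = feats[sel].mean(axis=0)
            seen[c] = True
    return protos, seen

def ema_update(global_protos, batch_protos, seen, momentum=0.99):
    # "Dynamic updating": exponential moving average of global prototypes,
    # applied only to classes observed in the current batch (momentum assumed).
    out = global_protos.copy()
    out[seen] = momentum * global_protos[seen] + (1 - momentum) * batch_protos[seen]
    return out

def prototype_contrastive_loss(feats, labels, protos, temperature=0.1):
    # Pull each embedding toward its own-class prototype and push it away from
    # other-class prototypes: cross-entropy over cosine similarities.
    f = feats / (np.linalg.norm(feats, axis=1, keepdims=True) + 1e-8)
    p = protos / (np.linalg.norm(protos, axis=1, keepdims=True) + 1e-8)
    logits = f @ p.T / temperature
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

In a UDA setting, the source batch would use ground-truth labels and the target batch pseudo-labels when forming prototypes; the EMA step is one plausible way to realize the "gradual transition" the abstract describes.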