Unsupervised Domain Adaptation for Anatomical Structure Detection in Ultrasound Images

Bin Pu, Xingguo Lv, Jiewen Yang, He Guannan, Xingbo Dong, Yiqun Lin, Li Shengli, Tan Ying, Liu Fei, Ming Chen, Zhe Jin, Kenli Li, Xiaomeng Li
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:41204-41220, 2024.

Abstract

Models trained on ultrasound images from one institution typically suffer a decline in performance when transferred directly to other institutions. Moreover, unlike natural images, fetal ultrasound images contain dense and overlapping structures, making structure detection more challenging. To tackle this problem, we propose a new Unsupervised Domain Adaptation (UDA) method named ToMo-UDA for fetal structure detection, which consists of a Topology Knowledge Transfer (TKT) module and a Morphology Knowledge Transfer (MKT) module. TKT leverages prior knowledge of fetal anatomy as topological information, reconstructing and aligning anatomical features across the source and target domains. MKT then formulates a more consistent and independent morphological representation for each substructure of an organ. To evaluate the proposed ToMo-UDA for ultrasound fetal anatomical structure detection, we introduce FUSH$^2$, a new Fetal UltraSound benchmark comprising Heart and Head images collected from Two health centers, with 16 annotated regions. Our experiments show that exploiting topological and morphological anatomical information in ToMo-UDA substantially improves organ structure detection, expanding the potential of structure detection tasks in medical image analysis.
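For readers unfamiliar with the general setup the abstract assumes, the sketch below illustrates the basic unsupervised domain adaptation recipe for detection: a supervised loss on labelled source-institution images plus an unsupervised feature-alignment loss on unlabelled target-institution images. This is only a minimal, hypothetical PyTorch illustration of that recipe; the model, loss, and names (SimpleDetector, alignment_loss) are placeholders and do not reproduce the paper's ToMo-UDA architecture, TKT, or MKT modules.

    # Minimal UDA-for-detection sketch (illustrative only, not the authors' code).
    import torch
    import torch.nn as nn

    class SimpleDetector(nn.Module):
        def __init__(self, num_classes=16):
            super().__init__()
            self.backbone = nn.Sequential(
                nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            # Toy head: one box (4 coords) + one score per class.
            self.head = nn.Linear(64, num_classes * 5)

        def forward(self, x):
            feat = self.backbone(x).flatten(1)   # pooled backbone features
            return feat, self.head(feat)

    def alignment_loss(source_feat, target_feat):
        # Toy cross-domain alignment: match mean backbone features of the two domains.
        return (source_feat.mean(0) - target_feat.mean(0)).pow(2).mean()

    model = SimpleDetector()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    source_images = torch.randn(4, 1, 128, 128)   # labelled images, institution A
    target_images = torch.randn(4, 1, 128, 128)   # unlabelled images, institution B
    source_targets = torch.randn(4, 16 * 5)       # toy detection supervision

    optimizer.zero_grad()
    src_feat, src_pred = model(source_images)
    tgt_feat, _ = model(target_images)            # no labels used for the target domain
    loss = nn.functional.mse_loss(src_pred, source_targets) \
           + 0.1 * alignment_loss(src_feat, tgt_feat)
    loss.backward()
    optimizer.step()

The paper's contribution lies in what is aligned: rather than matching raw feature statistics as above, ToMo-UDA aligns topological (TKT) and morphological (MKT) representations derived from prior fetal anatomical knowledge.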

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-pu24b,
  title     = {Unsupervised Domain Adaptation for Anatomical Structure Detection in Ultrasound Images},
  author    = {Pu, Bin and Lv, Xingguo and Yang, Jiewen and Guannan, He and Dong, Xingbo and Lin, Yiqun and Shengli, Li and Ying, Tan and Fei, Liu and Chen, Ming and Jin, Zhe and Li, Kenli and Li, Xiaomeng},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {41204--41220},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/pu24b/pu24b.pdf},
  url       = {https://proceedings.mlr.press/v235/pu24b.html}
}
Endnote
%0 Conference Paper
%T Unsupervised Domain Adaptation for Anatomical Structure Detection in Ultrasound Images
%A Bin Pu
%A Xingguo Lv
%A Jiewen Yang
%A He Guannan
%A Xingbo Dong
%A Yiqun Lin
%A Li Shengli
%A Tan Ying
%A Liu Fei
%A Ming Chen
%A Zhe Jin
%A Kenli Li
%A Xiaomeng Li
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-pu24b
%I PMLR
%P 41204--41220
%U https://proceedings.mlr.press/v235/pu24b.html
%V 235
APA
Pu, B., Lv, X., Yang, J., Guannan, H., Dong, X., Lin, Y., Shengli, L., Ying, T., Fei, L., Chen, M., Jin, Z., Li, K., & Li, X. (2024). Unsupervised Domain Adaptation for Anatomical Structure Detection in Ultrasound Images. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:41204-41220. Available from https://proceedings.mlr.press/v235/pu24b.html.
