i3Deep: Efficient 3D interactive segmentation with the nnU-Net

Karol Gotkowski, Camila Gonzalez, Isabel Kaltenborn, Ricarda Fischbach, Andreas Bucher, Anirban Mukhopadhyay
Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, PMLR 172:441-456, 2022.

Abstract

3D interactive segmentation is highly relevant in reducing the annotation time for experts. However, current methods often achieve only small segmentation improvements per interaction as lightweight models are a requirement to ensure near-realtime usage. Models with better predictive performance such as the nnU-Net cannot be employed for interactive segmentation due to their high computational demands, which result in long inference times. To solve this issue, we propose the 3D interactive segmentation framework i3Deep. Slices are selected through uncertainty estimation in an offline setting and afterwards corrected by an expert. The slices are then fed to a refinement nnU-Net, which significantly improves the global 3D segmentation from the local corrections. This approach bypasses the issue of long inference times by moving expensive computations into an offline setting that does not include the expert. For three different anatomies, our approach reduces the workload of the expert by 80.3%, while significantly improving the Dice by up to 39.5%, outperforming other state-of-the-art methods by a clear margin. Even on out-of-distribution data i3Deep is able to improve the segmentation by 19.3%.
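The abstract does not spell out the uncertainty measure used to pick slices for expert correction; a common choice is per-voxel softmax entropy averaged over each slice. The following minimal sketch illustrates that idea under this assumption (the function name `select_uncertain_slices` and the aggregation are illustrative, not taken from the paper):

```python
import numpy as np

def select_uncertain_slices(probs: np.ndarray, k: int = 3) -> np.ndarray:
    """Rank axial slices of a segmentation probability volume by uncertainty.

    probs: array of shape (C, Z, Y, X) with per-class softmax probabilities.
    Returns the indices of the k most uncertain axial slices, which would
    then be shown to the expert for correction in an offline setting.
    """
    eps = 1e-12  # avoid log(0)
    # Per-voxel predictive entropy over the class dimension -> (Z, Y, X).
    entropy = -np.sum(probs * np.log(probs + eps), axis=0)
    # One uncertainty score per axial slice (mean over in-plane voxels).
    slice_scores = entropy.mean(axis=(1, 2))
    # Highest-entropy slices first.
    return np.argsort(slice_scores)[::-1][:k]

# Toy example: a 2-class volume with 5 slices, confident everywhere
# except slice 2, which is maximally uncertain (p = 0.5 per class).
probs = np.full((2, 5, 4, 4), 0.05)
probs[0] = 0.95
probs[:, 2] = 0.5
print(select_uncertain_slices(probs, k=1))  # → [2]
```

In i3Deep, the corrected slices are then fed to the refinement nnU-Net, which propagates the local corrections to the full 3D segmentation.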

Cite this Paper


BibTeX
@InProceedings{pmlr-v172-gotkowski22a,
  title     = {i3Deep: Efficient 3D interactive segmentation with the nnU-Net},
  author    = {Gotkowski, Karol and Gonzalez, Camila and Kaltenborn, Isabel and Fischbach, Ricarda and Bucher, Andreas and Mukhopadhyay, Anirban},
  booktitle = {Proceedings of The 5th International Conference on Medical Imaging with Deep Learning},
  pages     = {441--456},
  year      = {2022},
  editor    = {Konukoglu, Ender and Menze, Bjoern and Venkataraman, Archana and Baumgartner, Christian and Dou, Qi and Albarqouni, Shadi},
  volume    = {172},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--08 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v172/gotkowski22a/gotkowski22a.pdf},
  url       = {https://proceedings.mlr.press/v172/gotkowski22a.html},
  abstract  = {3D interactive segmentation is highly relevant in reducing the annotation time for experts. However, current methods often achieve only small segmentation improvements per interaction as lightweight models are a requirement to ensure near-realtime usage. Models with better predictive performance such as the nnU-Net cannot be employed for interactive segmentation due to their high computational demands, which result in long inference times. To solve this issue, we propose the 3D interactive segmentation framework i3Deep. Slices are selected through uncertainty estimation in an offline setting and afterwards corrected by an expert. The slices are then fed to a refinement nnU-Net, which significantly improves the global 3D segmentation from the local corrections. This approach bypasses the issue of long inference times by moving expensive computations into an offline setting that does not include the expert. For three different anatomies, our approach reduces the workload of the expert by 80.3%, while significantly improving the Dice by up to 39.5%, outperforming other state-of-the-art methods by a clear margin. Even on out-of-distribution data i3Deep is able to improve the segmentation by 19.3%.}
}
Endnote
%0 Conference Paper
%T i3Deep: Efficient 3D interactive segmentation with the nnU-Net
%A Karol Gotkowski
%A Camila Gonzalez
%A Isabel Kaltenborn
%A Ricarda Fischbach
%A Andreas Bucher
%A Anirban Mukhopadhyay
%B Proceedings of The 5th International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Ender Konukoglu
%E Bjoern Menze
%E Archana Venkataraman
%E Christian Baumgartner
%E Qi Dou
%E Shadi Albarqouni
%F pmlr-v172-gotkowski22a
%I PMLR
%P 441--456
%U https://proceedings.mlr.press/v172/gotkowski22a.html
%V 172
%X 3D interactive segmentation is highly relevant in reducing the annotation time for experts. However, current methods often achieve only small segmentation improvements per interaction as lightweight models are a requirement to ensure near-realtime usage. Models with better predictive performance such as the nnU-Net cannot be employed for interactive segmentation due to their high computational demands, which result in long inference times. To solve this issue, we propose the 3D interactive segmentation framework i3Deep. Slices are selected through uncertainty estimation in an offline setting and afterwards corrected by an expert. The slices are then fed to a refinement nnU-Net, which significantly improves the global 3D segmentation from the local corrections. This approach bypasses the issue of long inference times by moving expensive computations into an offline setting that does not include the expert. For three different anatomies, our approach reduces the workload of the expert by 80.3%, while significantly improving the Dice by up to 39.5%, outperforming other state-of-the-art methods by a clear margin. Even on out-of-distribution data i3Deep is able to improve the segmentation by 19.3%.
APA
Gotkowski, K., Gonzalez, C., Kaltenborn, I., Fischbach, R., Bucher, A. & Mukhopadhyay, A. (2022). i3Deep: Efficient 3D interactive segmentation with the nnU-Net. Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 172:441-456. Available from https://proceedings.mlr.press/v172/gotkowski22a.html.