Orientation Estimation of Abdominal Ultrasound Images with Multi-Hypotheses Networks

Timo Horstmann, Oliver Zettinig, Wolfgang Wein, Raphael Prevost
Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, PMLR 172:521-534, 2022.

Abstract

Ultrasound imaging can provide valuable information to clinicians during interventions, in particular when fused with other modalities. Multi-modal image registration algorithms, however, require a somewhat accurate initialization, which is particularly difficult to estimate for ultrasound images as their orientation is arbitrary and their content ambiguous (limited field of view, artifacts, etc.). In this work, we not only train neural networks to predict the absolute orientation of ultrasound frames, but also to produce a confidence for each prediction. This allows us to select only the most confident frames in the clip. Our networks are trained to produce multiple hypotheses using a simple yet overlooked meta-loss that is specifically designed to capture the ambiguity of the input data. We show on several abdominal ultrasound datasets that multi-hypotheses networks provide better uncertainty estimates than Monte-Carlo dropout while being more efficient than network ensembling. Generic, easy to implement and able to quantify both data ambiguity and out-of-distribution samples, they represent a preferable alternative to traditional baselines for uncertainty estimation. In a clinical test, our method produces estimates within $20^{\circ}$ of the true orientation, which we can use to improve the accuracy of a subsequent registration algorithm down to less than $10^{\circ}$.
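The abstract does not spell out the meta-loss, but the description (multiple hypotheses, a loss that captures data ambiguity, a per-prediction confidence used to pick frames) is consistent with a relaxed winner-takes-all scheme for multi-hypothesis regression. The sketch below illustrates such a setup in PyTorch for quaternion-based orientation regression; it is not the authors' code, and the class and function names, the quaternion parameterization, the number of hypotheses and the relaxation weight eps are all assumptions.

# Minimal sketch (assumed, not the paper's implementation) of a relaxed
# winner-takes-all meta-loss over K orientation hypotheses.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHypothesisHead(nn.Module):
    """Predicts K orientation hypotheses (unit quaternions) per frame."""
    def __init__(self, in_features: int, num_hypotheses: int = 5):
        super().__init__()
        self.num_hypotheses = num_hypotheses
        self.fc = nn.Linear(in_features, 4 * num_hypotheses)  # 4 = quaternion dims

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        q = self.fc(features).view(-1, self.num_hypotheses, 4)
        return F.normalize(q, dim=-1)  # project to unit quaternions

def quaternion_angle(q_pred: torch.Tensor, q_gt: torch.Tensor) -> torch.Tensor:
    """Geodesic angle (radians) between two rotations given as quaternions."""
    dot = (q_pred * q_gt).sum(dim=-1).abs().clamp(max=1.0 - 1e-7)
    return 2.0 * torch.acos(dot)

def relaxed_wta_loss(q_pred: torch.Tensor, q_gt: torch.Tensor, eps: float = 0.05) -> torch.Tensor:
    """Winner-takes-all meta-loss: the best hypothesis receives most of the
    gradient, the others a small share (eps), so the hypotheses can spread
    over the ambiguous modes of the data instead of collapsing to the mean."""
    # q_pred: (B, K, 4), q_gt: (B, 4)
    errors = quaternion_angle(q_pred, q_gt.unsqueeze(1))           # (B, K)
    best = errors.min(dim=1).values                                # (B,)
    return ((1.0 - eps) * best + eps * errors.mean(dim=1)).mean()

def hypothesis_spread(q_pred: torch.Tensor) -> torch.Tensor:
    """Simple confidence proxy: mean pairwise angle between hypotheses
    (includes the zero self-distances). Low spread = hypotheses agree."""
    pairwise = quaternion_angle(q_pred.unsqueeze(2), q_pred.unsqueeze(1))  # (B, K, K)
    return pairwise.mean(dim=(1, 2))

Under these assumptions, keeping only the frames of a clip with the lowest hypothesis spread would play the role of the confidence-based frame selection described in the abstract, before handing the estimated orientation to the registration algorithm as an initialization.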

Cite this Paper


BibTeX
@InProceedings{pmlr-v172-horstmann22a,
  title     = {Orientation Estimation of Abdominal Ultrasound Images with Multi-Hypotheses Networks},
  author    = {Horstmann, Timo and Zettinig, Oliver and Wein, Wolfgang and Prevost, Raphael},
  booktitle = {Proceedings of The 5th International Conference on Medical Imaging with Deep Learning},
  pages     = {521--534},
  year      = {2022},
  editor    = {Konukoglu, Ender and Menze, Bjoern and Venkataraman, Archana and Baumgartner, Christian and Dou, Qi and Albarqouni, Shadi},
  volume    = {172},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--08 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v172/horstmann22a/horstmann22a.pdf},
  url       = {https://proceedings.mlr.press/v172/horstmann22a.html},
  abstract  = {Ultrasound imaging can provide valuable information to clinicians during interventions, in particular when fused with other modalities. Multi-modal image registration algorithms however require a somewhat accurate initialization, which is particularly difficult to estimate for ultrasound images as their orientation is arbitrary and their content ambiguous (limited field of view, artifacts, etc.). In this work, we not only train neural networks to predict the absolute orientation of ultrasound frames, but also to produce a confidence for each prediction. This allows us to select only the most confident frames in the clip. Our networks are trained to produce multiple hypotheses using a simple yet overlooked meta-loss that is specifically designed to capture the ambiguity of the input data. We show on several abdominal ultrasound datasets that multi-hypotheses networks provide better uncertainty estimates than Monte-Carlo dropout while being more efficient than network ensembling. Generic, easy to implement and able to quantify both data ambiguity and out-of-distribution samples, they represent a preferable alternative to traditional baselines for uncertainty estimation. On a clinical test our method produces estimates within $20^{\circ}$ of the true orientation, which we can use to improve the accuracy of a subsequent registration algorithm down to less than $10^{\circ}$.}
}
Endnote
%0 Conference Paper
%T Orientation Estimation of Abdominal Ultrasound Images with Multi-Hypotheses Networks
%A Timo Horstmann
%A Oliver Zettinig
%A Wolfgang Wein
%A Raphael Prevost
%B Proceedings of The 5th International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Ender Konukoglu
%E Bjoern Menze
%E Archana Venkataraman
%E Christian Baumgartner
%E Qi Dou
%E Shadi Albarqouni
%F pmlr-v172-horstmann22a
%I PMLR
%P 521--534
%U https://proceedings.mlr.press/v172/horstmann22a.html
%V 172
%X Ultrasound imaging can provide valuable information to clinicians during interventions, in particular when fused with other modalities. Multi-modal image registration algorithms however require a somewhat accurate initialization, which is particularly difficult to estimate for ultrasound images as their orientation is arbitrary and their content ambiguous (limited field of view, artifacts, etc.). In this work, we not only train neural networks to predict the absolute orientation of ultrasound frames, but also to produce a confidence for each prediction. This allows us to select only the most confident frames in the clip. Our networks are trained to produce multiple hypotheses using a simple yet overlooked meta-loss that is specifically designed to capture the ambiguity of the input data. We show on several abdominal ultrasound datasets that multi-hypotheses networks provide better uncertainty estimates than Monte-Carlo dropout while being more efficient than network ensembling. Generic, easy to implement and able to quantify both data ambiguity and out-of-distribution samples, they represent a preferable alternative to traditional baselines for uncertainty estimation. On a clinical test our method produces estimates within $20^{\circ}$ of the true orientation, which we can use to improve the accuracy of a subsequent registration algorithm down to less than $10^{\circ}$.
APA
Horstmann, T., Zettinig, O., Wein, W. & Prevost, R. (2022). Orientation Estimation of Abdominal Ultrasound Images with Multi-Hypotheses Networks. Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 172:521-534. Available from https://proceedings.mlr.press/v172/horstmann22a.html.
