WSI-BayesUNet: Uncertainty-Aware Deep Learning for Histopathological Image Segmentation with Active Learning

Yijun Cui, Geert Litjens, Nadieh Khalili
Proceedings of the MICCAI Workshop on Computational Pathology, PMLR 316:338-346, 2026.

Abstract

Histopathological image segmentation is a core task in digital pathology, supporting applications such as cancer detection and subtype classification. Manual annotation is time-consuming and subjective, making automation essential for improving efficiency and consistency in diagnostic workflows. Although deep learning models have significantly automated this process, they still make silent mistakes. Quantifying the uncertainty of the model and using the uncertainty for further improvement is not fully addressed. The most common way to quantify uncertainty is through ensemble methods, which provide empirical uncertainty estimation but face limitations, including high computational costs and theoretical instability. To address these, we propose a Bayesian U-Net framework that employs variational inference for principled probabilistic uncertainty estimation. Leveraging active learning, our Bayesian U-Net iteratively improves segmentation performance by prioritizing the most uncertain samples. Experiments on the TIGER and CAMELYON17 datasets show that Bayesian U-Net outperforms ensemble methods, offering better uncertainty quantification, uncertainty-guided performance gains, and faster convergence. Notably, uncertainty-based sampling consistently surpasses random sampling, significantly reducing annotation effort while maintaining or improving segmentation accuracy.
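The uncertainty-guided sampling strategy described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a Bayesian segmentation model that yields T stochastic softmax predictions per patch (e.g., by sampling weights from the variational posterior), scores each unlabeled patch by mean predictive entropy, and selects the top-k most uncertain patches for annotation:

```python
import numpy as np

def predictive_entropy(prob_samples):
    """Per-pixel predictive entropy from T stochastic forward passes.

    prob_samples: array of shape (T, C, H, W) holding softmax
    probabilities from T draws of the approximate weight posterior.
    Returns an (H, W) entropy map; higher values mean more uncertainty.
    """
    mean_prob = prob_samples.mean(axis=0)   # marginalize over samples -> (C, H, W)
    eps = 1e-12                             # guard against log(0)
    return -(mean_prob * np.log(mean_prob + eps)).sum(axis=0)

def select_most_uncertain(prob_samples_per_patch, k):
    """Rank unlabeled patches by mean entropy; return indices of the top-k."""
    scores = [predictive_entropy(p).mean() for p in prob_samples_per_patch]
    return np.argsort(scores)[::-1][:k]
```

In an active-learning loop, the selected indices would be sent for annotation, added to the labeled pool, and the model retrained before the next query round.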

Cite this Paper


BibTeX
@InProceedings{pmlr-v316-cui26a,
  title     = {WSI-BayesUNet: Uncertainty-Aware Deep Learning for Histopathological Image Segmentation with Active Learning},
  author    = {Cui, Yijun and Litjens, Geert and Khalili, Nadieh},
  booktitle = {Proceedings of the MICCAI Workshop on Computational Pathology},
  pages     = {338--346},
  year      = {2026},
  editor    = {Studer, Linda and Ciompi, Francesco and Khalili, Nadieh and Faryna, Khrystyna and Yeong, Joe and Lau, Mai Chan and Chen, Hao and Liu, Ziyi and Brattoli, Biagio},
  volume    = {316},
  series    = {Proceedings of Machine Learning Research},
  month     = {27 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v316/main/assets/cui26a/cui26a.pdf},
  url       = {https://proceedings.mlr.press/v316/cui26a.html},
  abstract  = {Histopathological image segmentation is a core task in digital pathology, supporting applications such as cancer detection and subtype classification. Manual annotation is time-consuming and subjective, making automation essential for improving efficiency and consistency in diagnostic workflows. Although deep learning models have significantly automated this process, they still make silent mistakes. Quantifying the uncertainty of the model and using the uncertainty for further improvement is not fully addressed. The most common way to quantify uncertainty is through ensemble methods, which provide empirical uncertainty estimation but face limitations, including high computational costs and theoretical instability. To address these, we propose a Bayesian U-Net framework that employs variational inference for principled probabilistic uncertainty estimation. Leveraging active learning, our Bayesian U-Net iteratively improves segmentation performance by prioritizing the most uncertain samples. Experiments on the TIGER and CAMELYON17 datasets show that Bayesian U-Net outperforms ensemble methods, offering better uncertainty quantification, uncertainty-guided performance gains, and faster convergence. Notably, uncertainty-based sampling consistently surpasses random sampling, significantly reducing annotation effort while maintaining or improving segmentation accuracy.}
}
Endnote
%0 Conference Paper
%T WSI-BayesUNet: Uncertainty-Aware Deep Learning for Histopathological Image Segmentation with Active Learning
%A Yijun Cui
%A Geert Litjens
%A Nadieh Khalili
%B Proceedings of the MICCAI Workshop on Computational Pathology
%C Proceedings of Machine Learning Research
%D 2026
%E Linda Studer
%E Francesco Ciompi
%E Nadieh Khalili
%E Khrystyna Faryna
%E Joe Yeong
%E Mai Chan Lau
%E Hao Chen
%E Ziyi Liu
%E Biagio Brattoli
%F pmlr-v316-cui26a
%I PMLR
%P 338--346
%U https://proceedings.mlr.press/v316/cui26a.html
%V 316
%X Histopathological image segmentation is a core task in digital pathology, supporting applications such as cancer detection and subtype classification. Manual annotation is time-consuming and subjective, making automation essential for improving efficiency and consistency in diagnostic workflows. Although deep learning models have significantly automated this process, they still make silent mistakes. Quantifying the uncertainty of the model and using the uncertainty for further improvement is not fully addressed. The most common way to quantify uncertainty is through ensemble methods, which provide empirical uncertainty estimation but face limitations, including high computational costs and theoretical instability. To address these, we propose a Bayesian U-Net framework that employs variational inference for principled probabilistic uncertainty estimation. Leveraging active learning, our Bayesian U-Net iteratively improves segmentation performance by prioritizing the most uncertain samples. Experiments on the TIGER and CAMELYON17 datasets show that Bayesian U-Net outperforms ensemble methods, offering better uncertainty quantification, uncertainty-guided performance gains, and faster convergence. Notably, uncertainty-based sampling consistently surpasses random sampling, significantly reducing annotation effort while maintaining or improving segmentation accuracy.
APA
Cui, Y., Litjens, G. & Khalili, N. (2026). WSI-BayesUNet: Uncertainty-Aware Deep Learning for Histopathological Image Segmentation with Active Learning. Proceedings of the MICCAI Workshop on Computational Pathology, in Proceedings of Machine Learning Research 316:338-346. Available from https://proceedings.mlr.press/v316/cui26a.html.