Equivariant bootstrapping for uncertainty quantification in imaging inverse problems

Marcelo Pereyra, Julián Tachella
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4141-4149, 2024.

Abstract

Scientific imaging problems are often severely ill-posed and hence have significant intrinsic uncertainty. Accurately quantifying the uncertainty in the solutions to such problems is therefore critical for the rigorous interpretation of experimental results as well as for reliably using the reconstructed images as scientific evidence. Unfortunately, existing imaging methods are unable to quantify the uncertainty in the reconstructed images in a way that is robust to experiment replications. This paper presents a new uncertainty quantification methodology based on an equivariant formulation of the parametric bootstrap algorithm that leverages symmetries and invariance properties commonly encountered in imaging problems. Additionally, the proposed methodology is general and can be easily applied with any image reconstruction technique, including unsupervised training strategies that can be trained from observed data alone, thus enabling uncertainty quantification in situations where there is no ground truth data available. We demonstrate the proposed approach with a series of experiments and comparisons with alternative state-of-the-art uncertainty quantification strategies. In all our experiments, the proposed equivariant bootstrap delivers remarkably accurate high-dimensional confidence regions and outperforms the competing approaches in terms of estimation accuracy, uncertainty quantification accuracy, and computing time. These empirical findings are supported by a detailed theoretical analysis of equivariant bootstrap for linear estimators.
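To make the idea concrete, the following is a minimal sketch of an equivariant parametric bootstrap for a linear observation model y = Ax + noise, written in NumPy. It is not the authors' reference implementation: the group of circular shifts, the Gaussian noise model, the pseudo-inverse estimator, and all function names are illustrative assumptions chosen only to show the structure of the procedure (randomly transform the estimate, re-measure it, reconstruct, and map the reconstruction back before forming bootstrap errors).

# Minimal sketch of an equivariant parametric bootstrap for y = A x + noise.
# All names, the shift group, and the pseudo-inverse estimator are illustrative
# assumptions, not the authors' reference implementation.
import numpy as np

rng = np.random.default_rng(0)

def equivariant_bootstrap(y, A, estimator, noise_std, n_boot=200):
    """Return bootstrap reconstruction errors x_tilde - x_hat.

    estimator: callable mapping a measurement vector to an image estimate.
    The group used here is circular shifts (np.roll), assumed to leave the
    image distribution (approximately) invariant.
    """
    x_hat = estimator(y)                      # pivot estimate from the observed data
    n = x_hat.size
    errors = np.empty((n_boot, n))
    for b in range(n_boot):
        shift = rng.integers(n)               # sample a random group element g
        x_g = np.roll(x_hat, shift)           # T_g x_hat
        y_b = A @ x_g + noise_std * rng.standard_normal(A.shape[0])   # re-measure
        x_b = np.roll(estimator(y_b), -shift) # T_g^{-1} f(y_b): map back to the original frame
        errors[b] = x_b - x_hat
    return errors

# Toy usage: a random compressive operator and a regularized pseudo-inverse estimator.
n, m, sigma = 64, 32, 0.05
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.clip(np.cumsum(rng.standard_normal(n)) / 5, -1, 1)
y = A @ x_true + sigma * rng.standard_normal(m)
f = lambda z: np.linalg.pinv(A) @ z           # stand-in for any reconstruction method
errs = equivariant_bootstrap(y, A, f, sigma)
# Radius of an approximate 95% confidence ball around x_hat, from the empirical
# quantile of the bootstrap error norms.
radius = np.quantile(np.linalg.norm(errs, axis=1), 0.95)
print(f"95% confidence-ball radius: {radius:.3f}")

The intuition behind the transformation step, as argued in the paper, is that randomly transforming the estimate before re-measuring it exposes directions of the image that the operator A does not observe in the original orientation, which the plain parametric bootstrap (re-measuring x_hat itself) would otherwise miss.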

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-pereyra24a,
  title     = {Equivariant bootstrapping for uncertainty quantification in imaging inverse problems},
  author    = {Pereyra, Marcelo and Tachella, Juli\'{a}n},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {4141--4149},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/pereyra24a/pereyra24a.pdf},
  url       = {https://proceedings.mlr.press/v238/pereyra24a.html},
  abstract  = {Scientific imaging problems are often severely ill-posed and hence have significant intrinsic uncertainty. Accurately quantifying the uncertainty in the solutions to such problems is therefore critical for the rigorous interpretation of experimental results as well as for reliably using the reconstructed images as scientific evidence. Unfortunately, existing imaging methods are unable to quantify the uncertainty in the reconstructed images in a way that is robust to experiment replications. This paper presents a new uncertainty quantification methodology based on an equivariant formulation of the parametric bootstrap algorithm that leverages symmetries and invariance properties commonly encountered in imaging problems. Additionally, the proposed methodology is general and can be easily applied with any image reconstruction technique, including unsupervised training strategies that can be trained from observed data alone, thus enabling uncertainty quantification in situations where there is no ground truth data available. We demonstrate the proposed approach with a series of experiments and comparisons with alternative state-of-the-art uncertainty quantification strategies. In all our experiments, the proposed equivariant bootstrap delivers remarkably accurate high-dimensional confidence regions and outperforms the competing approaches in terms of estimation accuracy, uncertainty quantification accuracy, and computing time. These empirical findings are supported by a detailed theoretical analysis of equivariant bootstrap for linear estimators.}
}
Endnote
%0 Conference Paper
%T Equivariant bootstrapping for uncertainty quantification in imaging inverse problems
%A Marcelo Pereyra
%A Julián Tachella
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-pereyra24a
%I PMLR
%P 4141--4149
%U https://proceedings.mlr.press/v238/pereyra24a.html
%V 238
%X Scientific imaging problems are often severely ill-posed and hence have significant intrinsic uncertainty. Accurately quantifying the uncertainty in the solutions to such problems is therefore critical for the rigorous interpretation of experimental results as well as for reliably using the reconstructed images as scientific evidence. Unfortunately, existing imaging methods are unable to quantify the uncertainty in the reconstructed images in a way that is robust to experiment replications. This paper presents a new uncertainty quantification methodology based on an equivariant formulation of the parametric bootstrap algorithm that leverages symmetries and invariance properties commonly encountered in imaging problems. Additionally, the proposed methodology is general and can be easily applied with any image reconstruction technique, including unsupervised training strategies that can be trained from observed data alone, thus enabling uncertainty quantification in situations where there is no ground truth data available. We demonstrate the proposed approach with a series of experiments and comparisons with alternative state-of-the-art uncertainty quantification strategies. In all our experiments, the proposed equivariant bootstrap delivers remarkably accurate high-dimensional confidence regions and outperforms the competing approaches in terms of estimation accuracy, uncertainty quantification accuracy, and computing time. These empirical findings are supported by a detailed theoretical analysis of equivariant bootstrap for linear estimators.
APA
Pereyra, M. & Tachella, J. (2024). Equivariant bootstrapping for uncertainty quantification in imaging inverse problems. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:4141-4149. Available from https://proceedings.mlr.press/v238/pereyra24a.html.