Randomised Postiterations for Calibrated BayesCG

Niall Vyas, Disha Hegde, Jon Cockayne
Proceedings of the First International Conference on Probabilistic Numerics, PMLR 271:75-83, 2025.

Abstract

The Bayesian conjugate gradient method offers probabilistic solutions to linear systems but suffers from poor calibration, limiting its utility in uncertainty quantification tasks. Recent approaches leveraging postiterations to construct priors have improved computational properties but failed to correct calibration issues. In this work, we propose a novel randomised postiteration strategy that enhances the calibration of the BayesCG posterior while preserving its favourable convergence characteristics. We present theoretical guarantees for the improved calibration, supported by results on the distribution of posterior errors. Numerical experiments demonstrate the efficacy of the method in both synthetic and inverse problem settings, showing enhanced uncertainty quantification and better propagation of uncertainties through computational pipelines.
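As a rough illustrative aid (not code from the paper), the sketch below shows the general form of a BayesCG-type Gaussian posterior over the solution of a linear system Ax = b: a Gaussian prior N(x0, Σ0) conditioned on projected observations Sᵀ A x = Sᵀ b, together with a simple z-statistic calibration check. The function name `bayescg_style_posterior`, the random choice of search directions S, and the chi-squared reading of the z-statistic are assumptions drawn from the standard probabilistic-numerics literature, not from this paper; the randomised postiteration proposed by the authors is not reproduced here.

```python
# Illustrative sketch only (assumptions noted above, not the paper's method):
# condition a Gaussian prior x ~ N(x0, Sigma0) on the projected observations
# S^T A x = S^T b, which is the general form of a BayesCG-type posterior,
# then evaluate a simple z-statistic calibration diagnostic.
import numpy as np

def bayescg_style_posterior(A, b, x0, Sigma0, S):
    """Posterior N(xm, Sigma_m) of x | S^T A x = S^T b under prior N(x0, Sigma0)."""
    r0 = b - A @ x0                        # initial residual
    AS = A @ S                             # A applied to the search directions
    Lam = S.T @ A @ Sigma0 @ AS            # Gram matrix S^T A Sigma0 A S
    K = Sigma0 @ AS @ np.linalg.inv(Lam)   # "gain" matrix
    xm = x0 + K @ (S.T @ r0)               # posterior mean
    Sigma_m = Sigma0 - K @ AS.T @ Sigma0   # posterior covariance
    return xm, Sigma_m

# Toy usage: SPD system, identity prior covariance, a few random directions
# (a placeholder assumption; BayesCG builds its directions iteratively).
rng = np.random.default_rng(0)
n, m = 20, 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
x_true = rng.standard_normal(n)            # a draw from the N(0, I) prior
b = A @ x_true
S = rng.standard_normal((n, m))
xm, Sm = bayescg_style_posterior(A, b, np.zeros(n), np.eye(n), S)

# Calibration diagnostic: (x_true - xm)^T Sm^+ (x_true - xm) should behave like a
# chi-squared draw with dof = rank(Sm) when the posterior is well calibrated.
err = x_true - xm
z = err @ np.linalg.pinv(Sm) @ err
print(f"z-statistic: {z:.2f}  (rank of posterior covariance: {np.linalg.matrix_rank(Sm)})")
```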

Cite this Paper


BibTeX
@InProceedings{pmlr-v271-vyas25a,
  title     = {Randomised Postiterations for Calibrated {B}ayesCG},
  author    = {Vyas, Niall and Hegde, Disha and Cockayne, Jon},
  booktitle = {Proceedings of the First International Conference on Probabilistic Numerics},
  pages     = {75--83},
  year      = {2025},
  editor    = {Kanagawa, Motonobu and Cockayne, Jon and Gessner, Alexandra and Hennig, Philipp},
  volume    = {271},
  series    = {Proceedings of Machine Learning Research},
  month     = {01--03 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v271/main/assets/vyas25a/vyas25a.pdf},
  url       = {https://proceedings.mlr.press/v271/vyas25a.html},
  abstract  = {The Bayesian conjugate gradient method offers probabilistic solutions to linear systems but suffers from poor calibration, limiting its utility in uncertainty quantification tasks. Recent approaches leveraging postiterations to construct priors have improved computational properties but failed to correct calibration issues. In this work, we propose a novel randomised postiteration strategy that enhances the calibration of the BayesCG posterior while preserving its favourable convergence characteristics. We present theoretical guarantees for the improved calibration, supported by results on the distribution of posterior errors. Numerical experiments demonstrate the efficacy of the method in both synthetic and inverse problem settings, showing enhanced uncertainty quantification and better propagation of uncertainties through computational pipelines.}
}
Endnote
%0 Conference Paper
%T Randomised Postiterations for Calibrated BayesCG
%A Niall Vyas
%A Disha Hegde
%A Jon Cockayne
%B Proceedings of the First International Conference on Probabilistic Numerics
%C Proceedings of Machine Learning Research
%D 2025
%E Motonobu Kanagawa
%E Jon Cockayne
%E Alexandra Gessner
%E Philipp Hennig
%F pmlr-v271-vyas25a
%I PMLR
%P 75--83
%U https://proceedings.mlr.press/v271/vyas25a.html
%V 271
%X The Bayesian conjugate gradient method offers probabilistic solutions to linear systems but suffers from poor calibration, limiting its utility in uncertainty quantification tasks. Recent approaches leveraging postiterations to construct priors have improved computational properties but failed to correct calibration issues. In this work, we propose a novel randomised postiteration strategy that enhances the calibration of the BayesCG posterior while preserving its favourable convergence characteristics. We present theoretical guarantees for the improved calibration, supported by results on the distribution of posterior errors. Numerical experiments demonstrate the efficacy of the method in both synthetic and inverse problem settings, showing enhanced uncertainty quantification and better propagation of uncertainties through computational pipelines.
APA
Vyas, N., Hegde, D. & Cockayne, J. (2025). Randomised Postiterations for Calibrated BayesCG. Proceedings of the First International Conference on Probabilistic Numerics, in Proceedings of Machine Learning Research 271:75-83. Available from https://proceedings.mlr.press/v271/vyas25a.html.