Calibrated Computation-Aware Gaussian Processes

Disha Hegde, Mohamed Adil, Jon Cockayne
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:2098-2106, 2025.

Abstract

Gaussian processes are notorious for scaling cubically with the size of the training set, preventing application to very large regression problems. Computation-aware Gaussian processes (CAGPs) tackle this scaling issue by exploiting probabilistic linear solvers to reduce complexity, widening the posterior with additional computational uncertainty due to reduced computation. However, the most commonly used CAGP framework results in (sometimes dramatically) conservative uncertainty quantification, making the posterior difficult to use in practice. In this work, we prove that if the utilised probabilistic linear solver is calibrated, in a rigorous statistical sense, then so too is the induced CAGP. We thus propose a new CAGP framework, CAGP-GS, based on using Gauss-Seidel iterations for the underlying probabilistic linear solver. CAGP-GS performs favourably compared to existing approaches when the test set is low-dimensional and few iterations are performed. We test the calibratedness on a synthetic problem, and compare the performance to existing approaches on a large-scale global temperature regression problem.
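To make the idea concrete, below is a minimal NumPy sketch of a computation-aware GP posterior in one common generic form, where i solver actions S_i replace the exact solve with the low-rank approximation C_i = S_i (S_i^T K_hat S_i)^{-1} S_i^T, so the reported covariance widens to account for the computation not performed. Coordinate (unit-vector) actions are used here as a stand-in for a Gauss-Seidel-style sweep; the names (rbf_kernel, cagp_posterior, num_iters) are illustrative choices and this is not the paper's implementation of CAGP-GS.

```python
# Minimal sketch of a computation-aware GP posterior (CAGP), assuming the
# generic projection form K_hat^{-1} ~ C_i = S_i (S_i^T K_hat S_i)^{-1} S_i^T
# built from i solver actions S_i. Coordinate (unit-vector) actions mimic a
# Gauss-Seidel-style sweep; this is an illustrative stand-in, not the
# authors' exact CAGP-GS algorithm.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between row-wise point sets A and B."""
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * sq_dists / lengthscale**2)

def cagp_posterior(X, y, X_test, num_iters, noise=1e-2, lengthscale=1.0):
    n = X.shape[0]
    K_hat = rbf_kernel(X, X, lengthscale) + noise * np.eye(n)   # K + sigma^2 I
    K_star = rbf_kernel(X_test, X, lengthscale)                 # test/train cross-covariances
    K_ss = rbf_kernel(X_test, X_test, lengthscale)              # test covariances

    # Actions: the first `num_iters` coordinate directions (Gauss-Seidel-style).
    S = np.eye(n)[:, :num_iters]
    gram = S.T @ K_hat @ S                                      # small i x i system
    C = S @ np.linalg.solve(gram, S.T)                          # low-rank surrogate for K_hat^{-1}

    mean = K_star @ (C @ y)                                     # approximate posterior mean
    # Combined covariance uses C in place of K_hat^{-1}, so it is wider than
    # the exact GP posterior: mathematical plus computational uncertainty.
    cov = K_ss - K_star @ C @ K_star.T
    return mean, cov

# Usage: a coarse posterior from only 50 solver iterations on 500 training points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
X_test = np.linspace(-3, 3, 5)[:, None]
mean, cov = cagp_posterior(X, y, X_test, num_iters=50)
print(mean.shape, np.diag(cov))
```

As num_iters grows, C approaches K_hat^{-1} and the sketch recovers the exact GP posterior; with few iterations the extra width is the computational uncertainty the abstract refers to.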

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-hegde25a,
  title     = {Calibrated Computation-Aware Gaussian Processes},
  author    = {Hegde, Disha and Adil, Mohamed and Cockayne, Jon},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {2098--2106},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/hegde25a/hegde25a.pdf},
  url       = {https://proceedings.mlr.press/v258/hegde25a.html},
  abstract  = {Gaussian processes are notorious for scaling cubically with the size of the training set, preventing application to very large regression problems. Computation-aware Gaussian processes (CAGPs) tackle this scaling issue by exploiting probabilistic linear solvers to reduce complexity, widening the posterior with additional \emph{computational} uncertainty due to reduced computation. However, the most commonly used CAGP framework results in (sometimes dramatically) conservative uncertainty quantification, making the posterior difficult to use in practice. In this work, we prove that if the utilised probabilistic linear solver is \emph{calibrated}, in a rigorous statistical sense, then so too is the induced CAGP. We thus propose a new CAGP framework, CAGP-GS, based on using Gauss-Seidel iterations for the underlying probabilistic linear solver. CAGP-GS performs favourably compared to existing approaches when the test set is low-dimensional and few iterations are performed. We test the calibratedness on a synthetic problem, and compare the performance to existing approaches on a large-scale global temperature regression problem.}
}
Endnote
%0 Conference Paper
%T Calibrated Computation-Aware Gaussian Processes
%A Disha Hegde
%A Mohamed Adil
%A Jon Cockayne
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-hegde25a
%I PMLR
%P 2098--2106
%U https://proceedings.mlr.press/v258/hegde25a.html
%V 258
%X Gaussian processes are notorious for scaling cubically with the size of the training set, preventing application to very large regression problems. Computation-aware Gaussian processes (CAGPs) tackle this scaling issue by exploiting probabilistic linear solvers to reduce complexity, widening the posterior with additional \emph{computational} uncertainty due to reduced computation. However, the most commonly used CAGP framework results in (sometimes dramatically) conservative uncertainty quantification, making the posterior difficult to use in practice. In this work, we prove that if the utilised probabilistic linear solver is \emph{calibrated}, in a rigorous statistical sense, then so too is the induced CAGP. We thus propose a new CAGP framework, CAGP-GS, based on using Gauss-Seidel iterations for the underlying probabilistic linear solver. CAGP-GS performs favourably compared to existing approaches when the test set is low-dimensional and few iterations are performed. We test the calibratedness on a synthetic problem, and compare the performance to existing approaches on a large-scale global temperature regression problem.
APA
Hegde, D., Adil, M. & Cockayne, J. (2025). Calibrated Computation-Aware Gaussian Processes. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:2098-2106. Available from https://proceedings.mlr.press/v258/hegde25a.html.
