A Bayesian framework for calibrating Gaussian process predictive distributions

Aurélien Pion, Emmanuel Vazquez
Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications, PMLR 266:748-750, 2025.

Abstract

Gaussian processes (GPs) provide principled uncertainty quantification through posterior predictive distributions. However, these distributions may be miscalibrated in practice when hyperparameters are estimated from data. This miscalibration can lead to unreliable decisions in downstream tasks. In this work, we propose calGP, a Bayesian calibration method that retains the GP posterior mean while modeling the normalized prediction error with a generalized normal distribution. The shape and scale parameters of this distribution are selected using a posterior sampling strategy guided by PIT-based calibration metrics. The resulting predictive distribution supports continuous confidence levels and improves tail behavior without retraining the underlying GP. We also introduce KS-PIT, a scalar diagnostic based on the Kolmogorov–Smirnov distance between PIT values and the uniform distribution. Numerical experiments demonstrate that calGP achieves better calibration than standard GP models, with controllable conservativeness and interpretable diagnostics.
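The KS-PIT diagnostic described above can be sketched in a few lines: PIT values are the predictive CDF evaluated at held-out observations, and KS-PIT is the Kolmogorov–Smirnov distance between those PIT values and the uniform distribution on [0, 1]. The sketch below is illustrative only, assuming Gaussian predictive distributions; the function name `ks_pit` and its interface are hypothetical, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm, kstest

def ks_pit(y_true, mu, sigma):
    """KS distance between PIT values and Uniform(0, 1).

    y_true    : held-out observations
    mu, sigma : predictive means and standard deviations (here Gaussian)
    """
    pit = norm.cdf(y_true, loc=mu, scale=sigma)  # probability integral transform
    stat, _ = kstest(pit, "uniform")             # sup |F_emp(u) - u|
    return stat

# A well-calibrated model yields near-uniform PIT values, hence KS-PIT near 0.
rng = np.random.default_rng(0)
mu = np.zeros(500)
sigma = np.ones(500)
y_cal = rng.normal(mu, sigma)          # predictive distribution matches the data
print(ks_pit(y_cal, mu, sigma))        # small value

y_mis = rng.normal(mu, 2 * sigma)      # variance underestimated by the model
print(ks_pit(y_mis, mu, sigma))        # noticeably larger value
```

In this toy setting the miscalibrated case (predictive variance too small by a factor of four) produces a clearly larger KS-PIT, which is the behavior the abstract exploits as a scalar calibration metric.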

Cite this Paper


BibTeX
@InProceedings{pmlr-v266-pion25a,
  title     = {A Bayesian framework for calibrating Gaussian process predictive distributions},
  author    = {Pion, Aur\'{e}lien and Vazquez, Emmanuel},
  booktitle = {Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications},
  pages     = {748--750},
  year      = {2025},
  editor    = {Nguyen, Khuong An and Luo, Zhiyuan and Papadopoulos, Harris and Löfström, Tuwe and Carlsson, Lars and Boström, Henrik},
  volume    = {266},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--12 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v266/main/assets/pion25a/pion25a.pdf},
  url       = {https://proceedings.mlr.press/v266/pion25a.html},
  abstract  = {Gaussian processes (GPs) provide principled uncertainty quantification through posterior predictive distributions. However, these distributions may be miscalibrated in practice when hyperparameters are estimated from data. This miscalibration can lead to unreliable decisions in downstream tasks. In this work, we propose calGP, a Bayesian calibration method that retains the GP posterior mean while modeling the normalized prediction error with a generalized normal distribution. The shape and scale parameters of this distribution are selected using a posterior sampling strategy guided by PIT-based calibration metrics. The resulting predictive distribution supports continuous confidence levels and improves tail behavior without retraining the underlying GP. We also introduce KS-PIT, a scalar diagnostic based on the Kolmogorov–Smirnov distance between PIT values and the uniform distribution. Numerical experiments demonstrate that calGP achieves better calibration than standard GP models, with controllable conservativeness and interpretable diagnostics.}
}
Endnote
%0 Conference Paper
%T A Bayesian framework for calibrating Gaussian process predictive distributions
%A Aurélien Pion
%A Emmanuel Vazquez
%B Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications
%C Proceedings of Machine Learning Research
%D 2025
%E Khuong An Nguyen
%E Zhiyuan Luo
%E Harris Papadopoulos
%E Tuwe Löfström
%E Lars Carlsson
%E Henrik Boström
%F pmlr-v266-pion25a
%I PMLR
%P 748--750
%U https://proceedings.mlr.press/v266/pion25a.html
%V 266
%X Gaussian processes (GPs) provide principled uncertainty quantification through posterior predictive distributions. However, these distributions may be miscalibrated in practice when hyperparameters are estimated from data. This miscalibration can lead to unreliable decisions in downstream tasks. In this work, we propose calGP, a Bayesian calibration method that retains the GP posterior mean while modeling the normalized prediction error with a generalized normal distribution. The shape and scale parameters of this distribution are selected using a posterior sampling strategy guided by PIT-based calibration metrics. The resulting predictive distribution supports continuous confidence levels and improves tail behavior without retraining the underlying GP. We also introduce KS-PIT, a scalar diagnostic based on the Kolmogorov–Smirnov distance between PIT values and the uniform distribution. Numerical experiments demonstrate that calGP achieves better calibration than standard GP models, with controllable conservativeness and interpretable diagnostics.
APA
Pion, A. & Vazquez, E. (2025). A Bayesian framework for calibrating Gaussian process predictive distributions. Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications, in Proceedings of Machine Learning Research 266:748-750. Available from https://proceedings.mlr.press/v266/pion25a.html.