A Bayesian framework for calibrating Gaussian process predictive distributions
Proceedings of the Fourteenth Symposium on Conformal and Probabilistic Prediction with Applications, PMLR 266:748-750, 2025.
Abstract
Gaussian processes (GPs) provide principled uncertainty quantification through posterior predictive distributions. However, these distributions may be miscalibrated in practice when hyperparameters are estimated from data. This miscalibration can lead to unreliable decisions in downstream tasks. In this work, we propose calGP, a Bayesian calibration method that retains the GP posterior mean while modeling the normalized prediction error with a generalized normal distribution. The shape and scale parameters of this distribution are selected using a posterior sampling strategy guided by PIT-based calibration metrics. The resulting predictive distribution supports continuous confidence levels and improves tail behavior without retraining the underlying GP. We also introduce KS-PIT, a scalar diagnostic based on the Kolmogorov–Smirnov distance between PIT values and the uniform distribution. Numerical experiments demonstrate that calGP achieves better calibration than standard GP models, with controllable conservativeness and interpretable diagnostics.
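The KS-PIT diagnostic described above can be illustrated with a small sketch (not the authors' implementation; the toy data, variable names, and the deliberately overconfident predictive scale are assumptions for illustration). Each probability integral transform (PIT) value is the predictive CDF evaluated at the observed target; KS-PIT is the Kolmogorov–Smirnov distance between these PIT values and the uniform distribution on [0, 1]:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy setup: targets drawn from N(0, 1), paired with a Gaussian
# predictive distribution per point. sigma_hat is deliberately too
# small (overconfident), so PIT values pile up near 0 and 1.
n = 500
y_true = rng.normal(loc=0.0, scale=1.0, size=n)
mu_hat = np.zeros(n)          # predictive means
sigma_hat = np.full(n, 0.5)   # overconfident predictive std devs

# PIT values: evaluate each predictive CDF at the observed target.
pit = stats.norm.cdf(y_true, loc=mu_hat, scale=sigma_hat)

# KS-PIT: Kolmogorov-Smirnov distance between the empirical PIT
# distribution and Uniform(0, 1); 0 indicates perfect calibration.
ks_pit = stats.kstest(pit, "uniform").statistic
print(f"KS-PIT (overconfident model): {ks_pit:.3f}")

# A well-calibrated predictive scale yields a much smaller KS-PIT.
pit_cal = stats.norm.cdf(y_true, loc=mu_hat, scale=1.0)
ks_pit_cal = stats.kstest(pit_cal, "uniform").statistic
print(f"KS-PIT (calibrated model):    {ks_pit_cal:.3f}")
```

Running this, the overconfident model produces a markedly larger KS-PIT than the calibrated one, matching the intended use of the diagnostic as a scalar calibration summary.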