Nonparametric Gaussian Process Covariances via Multidimensional Convolutions

Thomas M. Mcdonald, Magnus Ross, Michael T. Smith, Mauricio A. Álvarez
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:8279-8293, 2023.

Abstract

A key challenge in the practical application of Gaussian processes (GPs) is selecting a proper covariance function. The process convolutions construction of GPs allows some additional flexibility, but still requires choosing a proper smoothing kernel, which is non-trivial. Previous approaches have built covariance functions by using GP priors over the smoothing kernel, and by extension the covariance, as a way to bypass the need to specify it in advance. However, these models have been limited in several ways: they are restricted to single-dimensional inputs, e.g. time; they only allow modelling of single outputs; and they do not scale to large datasets, since inference is not straightforward. In this paper, we introduce a nonparametric process convolution formulation for GPs that alleviates these weaknesses. We achieve this using a functional sampling approach based on Matheron's rule to perform fast sampling using interdomain inducing variables. We test the performance of our model on benchmarks for single-output, multi-output and large-scale GP regression, and find that our approach can provide improvements over standard GP models, particularly for larger datasets.
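For readers unfamiliar with the pathwise sampling the abstract refers to, the sketch below illustrates Matheron's rule for a GP: a posterior sample is obtained by drawing a joint prior sample and then applying a deterministic update conditioned on (inducing) observations. This is a minimal illustration only, not the paper's method: it assumes a fixed RBF kernel (the paper instead learns the covariance nonparametrically) and draws the inducing variables from the prior rather than from a learned variational distribution.

```python
import numpy as np

def rbf(X, Y, lengthscale=1.0):
    # Squared-exponential kernel (a stand-in; the paper learns the covariance).
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

rng = np.random.default_rng(0)

# Hypothetical setup: test inputs X and inducing inputs Z.
X = np.linspace(-3, 3, 50)[:, None]
Z = np.linspace(-3, 3, 10)[:, None]
n, m = len(X), len(Z)

Kzz = rbf(Z, Z) + 1e-6 * np.eye(m)
Kxz = rbf(X, Z)

# 1. Draw a single joint prior sample over (f(X), f(Z)).
K_joint = np.block([[rbf(X, X), Kxz], [Kxz.T, Kzz]])
L = np.linalg.cholesky(K_joint + 1e-6 * np.eye(n + m))
joint = L @ rng.standard_normal(n + m)
f_prior, f_at_Z = joint[:n], joint[n:]

# 2. Draw the inducing variables u (from the prior here, for simplicity;
#    in the sparse variational setting u ~ q(u) instead).
u = np.linalg.cholesky(Kzz) @ rng.standard_normal(m)

# 3. Matheron's rule: pathwise update of the prior sample,
#    f_post = f_prior + Kxz Kzz^{-1} (u - f(Z)).
f_post = f_prior + Kxz @ np.linalg.solve(Kzz, u - f_at_Z)

print(f_post.shape)  # (50,)
```

The key property, which the paper exploits for fast sampling with interdomain inducing variables, is that the expensive part (the prior draw) can use cheap approximations, while the conditioning step is a small linear solve of size equal to the number of inducing points.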

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-mcdonald23a,
  title     = {Nonparametric Gaussian Process Covariances via Multidimensional Convolutions},
  author    = {Mcdonald, Thomas M. and Ross, Magnus and Smith, Michael T. and \'Alvarez, Mauricio A.},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {8279--8293},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/mcdonald23a/mcdonald23a.pdf},
  url       = {https://proceedings.mlr.press/v206/mcdonald23a.html},
  abstract  = {A key challenge in the practical application of Gaussian processes (GPs) is selecting a proper covariance function. The process convolutions construction of GPs allows some additional flexibility, but still requires choosing a proper smoothing kernel, which is non-trivial. Previous approaches have built covariance functions by using GP priors over the smoothing kernel, and by extension the covariance, as a way to bypass the need to specify it in advance. However, these models have been limited in several ways: they are restricted to single dimensional inputs, e.g. time; they only allow modelling of single outputs and they do not scale to large datasets since inference is not straightforward. In this paper, we introduce a nonparametric process convolution formulation for GPs that alleviates these weaknesses. We achieve this using a functional sampling approach based on Matheron's rule to perform fast sampling using interdomain inducing variables. We test the performance of our model on benchmarks for single output, multi-output and large-scale GP regression, and find that our approach can provide improvements over standard GP models, particularly for larger datasets.}
}
Endnote
%0 Conference Paper
%T Nonparametric Gaussian Process Covariances via Multidimensional Convolutions
%A Thomas M. Mcdonald
%A Magnus Ross
%A Michael T. Smith
%A Mauricio A. Álvarez
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-mcdonald23a
%I PMLR
%P 8279--8293
%U https://proceedings.mlr.press/v206/mcdonald23a.html
%V 206
%X A key challenge in the practical application of Gaussian processes (GPs) is selecting a proper covariance function. The process convolutions construction of GPs allows some additional flexibility, but still requires choosing a proper smoothing kernel, which is non-trivial. Previous approaches have built covariance functions by using GP priors over the smoothing kernel, and by extension the covariance, as a way to bypass the need to specify it in advance. However, these models have been limited in several ways: they are restricted to single dimensional inputs, e.g. time; they only allow modelling of single outputs and they do not scale to large datasets since inference is not straightforward. In this paper, we introduce a nonparametric process convolution formulation for GPs that alleviates these weaknesses. We achieve this using a functional sampling approach based on Matheron's rule to perform fast sampling using interdomain inducing variables. We test the performance of our model on benchmarks for single output, multi-output and large-scale GP regression, and find that our approach can provide improvements over standard GP models, particularly for larger datasets.
APA
Mcdonald, T.M., Ross, M., Smith, M.T. & Álvarez, M.A. (2023). Nonparametric Gaussian Process Covariances via Multidimensional Convolutions. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:8279-8293. Available from https://proceedings.mlr.press/v206/mcdonald23a.html.