Fast Fourier Bayesian Quadrature

Houston Warren, Fabio Ramos
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4555-4563, 2024.

Abstract

In numerical integration, Bayesian quadrature (BQ) excels at producing estimates with quantified uncertainties, particularly in sparse data settings. However, its computational scalability and kernel learning capabilities have lagged behind modern advances in Gaussian process research. To bridge this gap, we recast the BQ posterior integral as a convolution operation, which enables efficient computation via fast Fourier transform of low-rank matrices. We introduce two new methods enabled by recasting BQ as a convolution: fast Fourier Bayesian quadrature and sparse spectrum Bayesian quadrature. These methods enhance the computational scalability of BQ and expand kernel flexibility, enabling the use of \textit{any} stationary kernel in the BQ setting. We empirically validate the efficacy of our approach through a range of integration tasks, substantiating the benefits of the proposed methodology.
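The core observation in the abstract, that for a stationary kernel the BQ posterior integral is a convolution that the FFT can evaluate efficiently, can be illustrated with a minimal sketch. This is not the paper's implementation: the grid, RBF kernel, lengthscale, and Gaussian integration measure below are all illustrative assumptions. The kernel-mean term z(t) = ∫ k(t − s) p(s) ds is computed as a circular convolution on a uniform grid.

```python
import numpy as np

# Illustrative sketch only (not the paper's code): for a stationary kernel k,
# the BQ kernel-mean z(t) = \int k(t - s) p(s) ds is the convolution
# (k * p)(t), so it can be evaluated on a grid in O(N log N) with the FFT.

N, L = 1024, 5.0
dx = 2 * L / N
x = -L + dx * np.arange(N)             # uniform grid with x[N // 2] == 0

ell = 0.5                              # assumed RBF lengthscale
k = np.exp(-0.5 * (x / ell) ** 2)      # stationary kernel profile k(r)
p = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)   # N(0, 1) measure

# Circular convolution via FFT; ifftshift moves k's centre (r = 0) to
# index 0 so that z[n] approximates sum_m k(x[n] - x[m]) p(x[m]) dx.
z = np.fft.ifft(np.fft.fft(np.fft.ifftshift(k)) * np.fft.fft(p)).real * dx

# Analytic check for these Gaussian choices:
# (k * p)(0) = sqrt(ell^2 / (ell^2 + 1)) = sqrt(0.2) ~ 0.4472
print(z[N // 2])
```

Because both the kernel and the measure decay well within the grid, the wrap-around error of the circular convolution is negligible here; the grid value at t = 0 matches the closed-form Gaussian convolution to high accuracy.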

Cite this Paper

BibTeX
@InProceedings{pmlr-v238-warren24a,
  title     = {Fast {F}ourier {B}ayesian Quadrature},
  author    = {Warren, Houston and Ramos, Fabio},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {4555--4563},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/warren24a/warren24a.pdf},
  url       = {https://proceedings.mlr.press/v238/warren24a.html},
  abstract  = {In numerical integration, Bayesian quadrature (BQ) excels at producing estimates with quantified uncertainties, particularly in sparse data settings. However, its computational scalability and kernel learning capabilities have lagged behind modern advances in Gaussian process research. To bridge this gap, we recast the BQ posterior integral as a convolution operation, which enables efficient computation via fast Fourier transform of low-rank matrices. We introduce two new methods enabled by recasting BQ as a convolution: fast Fourier Bayesian quadrature and sparse spectrum Bayesian quadrature. These methods enhance the computational scalability of BQ and expand kernel flexibility, enabling the use of \textit{any} stationary kernel in the BQ setting. We empirically validate the efficacy of our approach through a range of integration tasks, substantiating the benefits of the proposed methodology.}
}
Endnote
%0 Conference Paper
%T Fast Fourier Bayesian Quadrature
%A Houston Warren
%A Fabio Ramos
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-warren24a
%I PMLR
%P 4555--4563
%U https://proceedings.mlr.press/v238/warren24a.html
%V 238
%X In numerical integration, Bayesian quadrature (BQ) excels at producing estimates with quantified uncertainties, particularly in sparse data settings. However, its computational scalability and kernel learning capabilities have lagged behind modern advances in Gaussian process research. To bridge this gap, we recast the BQ posterior integral as a convolution operation, which enables efficient computation via fast Fourier transform of low-rank matrices. We introduce two new methods enabled by recasting BQ as a convolution: fast Fourier Bayesian quadrature and sparse spectrum Bayesian quadrature. These methods enhance the computational scalability of BQ and expand kernel flexibility, enabling the use of \textit{any} stationary kernel in the BQ setting. We empirically validate the efficacy of our approach through a range of integration tasks, substantiating the benefits of the proposed methodology.
APA
Warren, H. & Ramos, F. (2024). Fast Fourier Bayesian Quadrature. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:4555-4563. Available from https://proceedings.mlr.press/v238/warren24a.html.