Inducing Point Allocation for Sparse Gaussian Processes in High-Throughput Bayesian Optimisation

Henry B. Moss, Sebastian W. Ober, Victor Picheny
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:5213-5230, 2023.

Abstract

Sparse Gaussian processes are a key component of high-throughput Bayesian optimisation (BO) loops; however, we show that existing methods for allocating their inducing points severely hamper optimisation performance. By exploiting the quality-diversity decomposition of determinantal point processes, we propose the first inducing point allocation strategy designed specifically for use in BO. Unlike existing methods which seek only to reduce global uncertainty in the objective function, our approach provides the local high-fidelity modelling of promising regions required for precise optimisation. More generally, we demonstrate that our proposed framework provides a flexible way to allocate modelling capacity in sparse models and so is suitable for a broad range of downstream sequential decision making tasks.
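To make the idea referenced in the abstract concrete: the quality-diversity decomposition writes a DPP's L-ensemble kernel as L = diag(q) K diag(q), where the similarity kernel K rewards spread-out (diverse) inducing points and the quality vector q can upweight promising regions of the search space. The sketch below is a minimal illustration of greedy inducing-point selection under such a quality-weighted kernel; it is not the paper's exact allocation rule, and the names `rbf_kernel`, `greedy_quality_dpp`, and the toy quality score in the usage snippet are assumptions made for illustration only.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0, variance=1.0):
    """Standard RBF (squared-exponential) kernel matrix."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def greedy_quality_dpp(X_cand, quality, num_inducing, lengthscale=1.0, jitter=1e-8):
    """Greedily select inducing points under a quality-weighted DPP.

    The L-ensemble kernel is L = diag(q) K diag(q): the RBF term K encourages
    diversity, while the quality vector q biases selection towards promising
    regions. Greedy maximisation of log det(L_S) via incremental Cholesky updates.
    """
    K = rbf_kernel(X_cand, X_cand, lengthscale)
    L = quality[:, None] * K * quality[None, :] + jitter * np.eye(len(X_cand))

    n = len(X_cand)
    selected = []
    di2 = np.diag(L).copy()            # marginal log-det gain of each candidate
    cis = np.zeros((num_inducing, n))  # partial Cholesky rows

    for m in range(num_inducing):
        j = int(np.argmax(di2))        # candidate with the largest gain
        selected.append(j)
        dj = np.sqrt(di2[j])
        # Incremental Cholesky update of the conditional gains.
        cis[m] = (L[j] - cis[:m].T @ cis[:m, j]) / dj
        di2 = np.maximum(di2 - cis[m] ** 2, 0.0)
        di2[j] = -np.inf               # never pick the same point twice

    return X_cand[selected]

# Illustrative usage: bias inducing points towards low observed objective values
# (a toy placeholder for a quality score, not the weighting used in the paper).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 2))
y = np.sum((X - 0.3) ** 2, axis=1)     # toy objective to be minimised
q = np.exp(-(y - y.min()))             # hypothetical quality: larger near the incumbent
Z = greedy_quality_dpp(X, q, num_inducing=30)
```

Setting the quality vector to a constant recovers a purely diversity-driven selection (global uncertainty reduction); sharpening it around the incumbent concentrates modelling capacity where precise optimisation is needed.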

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-moss23a,
  title     = {Inducing Point Allocation for Sparse Gaussian Processes in High-Throughput Bayesian Optimisation},
  author    = {Moss, Henry B. and Ober, Sebastian W. and Picheny, Victor},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {5213--5230},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/moss23a/moss23a.pdf},
  url       = {https://proceedings.mlr.press/v206/moss23a.html},
  abstract  = {Sparse Gaussian processes are a key component of high-throughput Bayesian optimisation (BO) loops; however, we show that existing methods for allocating their inducing points severely hamper optimisation performance. By exploiting the quality-diversity decomposition of determinantal point processes, we propose the first inducing point allocation strategy designed specifically for use in BO. Unlike existing methods which seek only to reduce global uncertainty in the objective function, our approach provides the local high-fidelity modelling of promising regions required for precise optimisation. More generally, we demonstrate that our proposed framework provides a flexible way to allocate modelling capacity in sparse models and so is suitable for a broad range of downstream sequential decision making tasks.}
}
Endnote
%0 Conference Paper
%T Inducing Point Allocation for Sparse Gaussian Processes in High-Throughput Bayesian Optimisation
%A Henry B. Moss
%A Sebastian W. Ober
%A Victor Picheny
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-moss23a
%I PMLR
%P 5213--5230
%U https://proceedings.mlr.press/v206/moss23a.html
%V 206
%X Sparse Gaussian processes are a key component of high-throughput Bayesian optimisation (BO) loops; however, we show that existing methods for allocating their inducing points severely hamper optimisation performance. By exploiting the quality-diversity decomposition of determinantal point processes, we propose the first inducing point allocation strategy designed specifically for use in BO. Unlike existing methods which seek only to reduce global uncertainty in the objective function, our approach provides the local high-fidelity modelling of promising regions required for precise optimisation. More generally, we demonstrate that our proposed framework provides a flexible way to allocate modelling capacity in sparse models and so is suitable for a broad range of downstream sequential decision making tasks.
APA
Moss, H.B., Ober, S.W. & Picheny, V. (2023). Inducing Point Allocation for Sparse Gaussian Processes in High-Throughput Bayesian Optimisation. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:5213-5230. Available from https://proceedings.mlr.press/v206/moss23a.html.