Accelerating Look-ahead in Bayesian Optimization: Multilevel Monte Carlo is All you Need

Shangda Yang, Vitaly Zankin, Maximilian Balandat, Stefan Scherer, Kevin Thomas Carlberg, Neil Walton, Kody J. H. Law
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:56722-56748, 2024.

Abstract

We leverage multilevel Monte Carlo (MLMC) to improve the performance of multi-step look-ahead Bayesian optimization (BO) methods that involve nested expectations and maximizations. Often these expectations must be computed by Monte Carlo (MC). The complexity rate of naive MC degrades for nested operations, whereas MLMC is capable of achieving the canonical MC convergence rate for this type of problem, independently of dimension and without any smoothness assumptions. Our theoretical study focuses on the approximation improvements for two- and three-step look-ahead acquisition functions, but, as we discuss, the approach is generalizable in various ways, including beyond the context of BO. Our findings are verified numerically and the benefits of MLMC for BO are illustrated on several benchmark examples. Code is available at https://github.com/Shangda-Yang/MLMCBO.
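
To make the mechanism concrete, below is a minimal NumPy sketch (not the authors' MLMCBO implementation) of an antithetic MLMC estimator for a nested expectation of the form E_Y[max(E_Z[h(Y, Z)], 0)], the basic structure underlying one-step look-ahead acquisition values. The toy integrand h, the geometric level schedule, and the sample counts are all illustrative assumptions; the paper's full method additionally handles the inner maximizations that arise in two- and three-step look-ahead.

# A self-contained sketch of antithetic MLMC for a nested expectation
#     eta = E_Y[ g( E_Z[ h(Y, Z) ] ) ],  with g(u) = max(u, 0).
# This is a hypothetical illustration, not code from the MLMCBO repository.
import numpy as np

rng = np.random.default_rng(0)

def h(y, z):
    # Toy inner integrand; in look-ahead BO this would be a posterior functional.
    return np.sin(y) + z

def g(u):
    # Outer nonlinearity (a ReLU, as in improvement-style utilities).
    return np.maximum(u, 0.0)

def level_sample(level, n0=4):
    """One antithetic MLMC sample of the level-`level` correction.

    The fine term uses n0 * 2**level inner samples; the coarse term reuses
    the *same* inner draws, split into two halves (antithetic coupling),
    which drives the variance of the correction to zero as `level` grows.
    """
    y = rng.normal()
    n_fine = n0 * 2**level
    z = rng.normal(size=n_fine)
    fine = g(np.mean(h(y, z)))
    if level == 0:
        return fine
    half = n_fine // 2
    coarse = 0.5 * (g(np.mean(h(y, z[:half]))) + g(np.mean(h(y, z[half:]))))
    return fine - coarse

def mlmc_estimate(max_level=5, m0=2**14):
    # Telescoping sum over levels; the geometrically decaying outer sample
    # counts are a crude stand-in for the optimal variance/cost allocation.
    total = 0.0
    for level in range(max_level + 1):
        m = max(m0 // 2**level, 1)
        total += np.mean([level_sample(level) for _ in range(m)])
    return total

print(f"MLMC estimate: {mlmc_estimate():.4f}")

Because each level's correction has rapidly shrinking variance, most samples are taken at the cheap coarse levels, which is what restores the canonical MC rate that naive nested MC loses.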

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-yang24aj,
  title = {Accelerating Look-ahead in {B}ayesian Optimization: Multilevel {M}onte {C}arlo is All you Need},
  author = {Yang, Shangda and Zankin, Vitaly and Balandat, Maximilian and Scherer, Stefan and Carlberg, Kevin Thomas and Walton, Neil and Law, Kody J. H.},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages = {56722--56748},
  year = {2024},
  editor = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume = {235},
  series = {Proceedings of Machine Learning Research},
  month = {21--27 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/yang24aj/yang24aj.pdf},
  url = {https://proceedings.mlr.press/v235/yang24aj.html},
  abstract = {We leverage multilevel Monte Carlo (MLMC) to improve the performance of multi-step look-ahead Bayesian optimization (BO) methods that involve nested expectations and maximizations. Often these expectations must be computed by Monte Carlo (MC). The complexity rate of naive MC degrades for nested operations, whereas MLMC is capable of achieving the canonical MC convergence rate for this type of problem, independently of dimension and without any smoothness assumptions. Our theoretical study focuses on the approximation improvements for two- and three-step look-ahead acquisition functions, but, as we discuss, the approach is generalizable in various ways, including beyond the context of BO. Our findings are verified numerically and the benefits of MLMC for BO are illustrated on several benchmark examples. Code is available at https://github.com/Shangda-Yang/MLMCBO.}
}
Endnote
%0 Conference Paper
%T Accelerating Look-ahead in Bayesian Optimization: Multilevel Monte Carlo is All you Need
%A Shangda Yang
%A Vitaly Zankin
%A Maximilian Balandat
%A Stefan Scherer
%A Kevin Thomas Carlberg
%A Neil Walton
%A Kody J. H. Law
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-yang24aj
%I PMLR
%P 56722--56748
%U https://proceedings.mlr.press/v235/yang24aj.html
%V 235
%X We leverage multilevel Monte Carlo (MLMC) to improve the performance of multi-step look-ahead Bayesian optimization (BO) methods that involve nested expectations and maximizations. Often these expectations must be computed by Monte Carlo (MC). The complexity rate of naive MC degrades for nested operations, whereas MLMC is capable of achieving the canonical MC convergence rate for this type of problem, independently of dimension and without any smoothness assumptions. Our theoretical study focuses on the approximation improvements for two- and three-step look-ahead acquisition functions, but, as we discuss, the approach is generalizable in various ways, including beyond the context of BO. Our findings are verified numerically and the benefits of MLMC for BO are illustrated on several benchmark examples. Code is available at https://github.com/Shangda-Yang/MLMCBO.
APA
Yang, S., Zankin, V., Balandat, M., Scherer, S., Carlberg, K.T., Walton, N. & Law, K.J.H. (2024). Accelerating Look-ahead in Bayesian Optimization: Multilevel Monte Carlo is All you Need. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:56722-56748. Available from https://proceedings.mlr.press/v235/yang24aj.html.