Modulating Surrogates for Bayesian Optimization

Erik Bodin, Markus Kaiser, Ieva Kazlauskaite, Zhenwen Dai, Neill Campbell, Carl Henrik Ek
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:970-979, 2020.

Abstract

Bayesian optimization (BO) methods often rely on the assumption that the objective function is well-behaved, but in practice, this is seldom true for real-world objectives even if noise-free observations can be collected. Common approaches, which try to model the objective as precisely as possible, often fail to make progress by spending too many evaluations modeling irrelevant details. We address this issue by proposing surrogate models that focus on the well-behaved structure in the objective function, which is informative for search, while ignoring detrimental structure that is challenging to model from few observations. First, we demonstrate that surrogate models with appropriate noise distributions can absorb challenging structures in the objective function by treating them as irreducible uncertainty. Second, we show that a latent Gaussian process is an excellent surrogate for this purpose, compared with Gaussian processes with standard noise distributions. We perform numerous experiments on a range of BO benchmarks and find that our approach improves reliability and performance when faced with challenging objective functions.
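
As a concrete illustration of the first claim, the sketch below is a deliberately minimal stand-in, not the paper's method: a plain NumPy Gaussian-process surrogate whose homoscedastic noise variance plays the role of the modulation, combined with expected improvement on a hypothetical objective that mixes a smooth trend with fine-scale structure. The objective, the noise_var setting, and the search bounds are illustrative assumptions; the paper's actual surrogate uses a latent Gaussian process to modulate the noise distribution, which this sketch does not implement.

    import numpy as np
    from scipy.stats import norm

    # Hypothetical 1-D objective: a smooth trend plus fine-scale structure
    # that is hard to model from few samples (a stand-in for a "badly
    # behaved" real-world objective).
    def objective(x):
        return np.sin(3.0 * x) + 0.3 * np.sign(np.sin(40.0 * x))

    def rbf_kernel(a, b, lengthscale=0.5, variance=1.0):
        d = a[:, None] - b[None, :]
        return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

    def gp_posterior(x_train, y_train, x_test, noise_var):
        # Standard GP regression equations. Here noise_var is the modelling
        # knob: a larger value lets the surrogate absorb fine-scale structure
        # as irreducible uncertainty rather than trying to interpolate it.
        K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
        Ks = rbf_kernel(x_train, x_test)
        Kss = rbf_kernel(x_test, x_test)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
        mu = Ks.T @ alpha
        v = np.linalg.solve(L, Ks)
        var = np.diag(Kss) - np.sum(v ** 2, axis=0)
        return mu, np.maximum(var, 1e-12)

    def expected_improvement(mu, var, best_y):
        # Expected improvement for minimisation.
        sigma = np.sqrt(var)
        z = (best_y - mu) / sigma
        return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    # Toy BO loop over a dense grid of candidate points.
    rng = np.random.default_rng(0)
    x_grid = np.linspace(0.0, 2.0, 400)
    x_train = rng.uniform(0.0, 2.0, size=5)
    y_train = objective(x_train)

    for _ in range(15):
        mu, var = gp_posterior(x_train, y_train, x_grid, noise_var=0.1)
        ei = expected_improvement(mu, var, y_train.min())
        x_next = x_grid[np.argmax(ei)]
        x_train = np.append(x_train, x_next)
        y_train = np.append(y_train, objective(x_next))

    print("best observed value:", y_train.min())

With noise_var close to zero the surrogate tries to interpolate the sign-flip structure and its posterior becomes erratic between samples; a moderate noise_var lets it treat that structure as irreducible uncertainty and follow the smooth trend, which is the behaviour the abstract describes.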

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-bodin20a,
  title     = {Modulating Surrogates for {B}ayesian Optimization},
  author    = {Bodin, Erik and Kaiser, Markus and Kazlauskaite, Ieva and Dai, Zhenwen and Campbell, Neill and Ek, Carl Henrik},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {970--979},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/bodin20a/bodin20a.pdf},
  url       = {https://proceedings.mlr.press/v119/bodin20a.html}
}
Endnote
%0 Conference Paper
%T Modulating Surrogates for Bayesian Optimization
%A Erik Bodin
%A Markus Kaiser
%A Ieva Kazlauskaite
%A Zhenwen Dai
%A Neill Campbell
%A Carl Henrik Ek
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-bodin20a
%I PMLR
%P 970--979
%U https://proceedings.mlr.press/v119/bodin20a.html
%V 119
APA
Bodin, E., Kaiser, M., Kazlauskaite, I., Dai, Z., Campbell, N. & Ek, C.H. (2020). Modulating Surrogates for Bayesian Optimization. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:970-979. Available from https://proceedings.mlr.press/v119/bodin20a.html.
