MALIBO: Meta-learning for Likelihood-free Bayesian Optimization

Jiarong Pan, Stefan Falkner, Felix Berkenkamp, Joaquin Vanschoren
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:39102-39134, 2024.

Abstract

Bayesian optimization (BO) is a popular method to optimize costly black-box functions, and meta-learning has emerged as a way to leverage knowledge from related tasks to optimize new tasks faster. However, existing meta-learning methods for BO rely on surrogate models that are not scalable or are sensitive to varying input scales and noise types across tasks. Moreover, they often overlook the uncertainty associated with task similarity, leading to unreliable task adaptation when a new task differs significantly or has not been sufficiently explored yet. We propose a novel meta-learning BO approach that bypasses the surrogate model and directly learns the utility of queries across tasks. It explicitly models task uncertainty and includes an auxiliary model to enable robust adaptation to new tasks. Extensive experiments show that our method achieves strong performance and outperforms multiple meta-learning BO methods across various benchmarks.
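Below is a minimal, illustrative sketch of the classifier-based ("likelihood-free") acquisition idea the abstract alludes to when it says the approach bypasses the surrogate model and directly learns the utility of queries. It is not the paper's MALIBO implementation and contains no meta-learning: the logistic-regression classifier, the 25% quantile threshold, the toy objective, and the random candidate search are all simplifying assumptions made purely for illustration.

```python
# Sketch of likelihood-free BO via a probabilistic classifier.
# Points in the best gamma-fraction of observations are labeled 1,
# and the classifier's predicted probability serves as the acquisition
# function. This mirrors the general classifier-based BO idea only;
# it is NOT the MALIBO algorithm from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression


def objective(x):
    # Toy 1-D black-box function to be minimized (assumption for the demo).
    return np.sin(3.0 * x) + 0.1 * x ** 2


rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(5, 1))   # initial design
y = objective(X).ravel()

for _ in range(20):
    # Label the best gamma-fraction of observations as positives.
    gamma = 0.25
    tau = np.quantile(y, gamma)
    z = (y <= tau).astype(int)

    # Classifier probability p(z = 1 | x) acts as the acquisition function.
    clf = LogisticRegression().fit(X, z)
    candidates = rng.uniform(-3.0, 3.0, size=(1024, 1))
    scores = clf.predict_proba(candidates)[:, 1]
    x_next = candidates[np.argmax(scores)].reshape(1, 1)

    # Evaluate the black box and update the data set.
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best value found:", y.min())
```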

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-pan24b,
  title     = {{MALIBO}: Meta-learning for Likelihood-free {B}ayesian Optimization},
  author    = {Pan, Jiarong and Falkner, Stefan and Berkenkamp, Felix and Vanschoren, Joaquin},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {39102--39134},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/pan24b/pan24b.pdf},
  url       = {https://proceedings.mlr.press/v235/pan24b.html},
  abstract  = {Bayesian optimization (BO) is a popular method to optimize costly black-box functions, and meta-learning has emerged as a way to leverage knowledge from related tasks to optimize new tasks faster. However, existing meta-learning methods for BO rely on surrogate models that are not scalable or are sensitive to varying input scales and noise types across tasks. Moreover, they often overlook the uncertainty associated with task similarity, leading to unreliable task adaptation when a new task differs significantly or has not been sufficiently explored yet. We propose a novel meta-learning BO approach that bypasses the surrogate model and directly learns the utility of queries across tasks. It explicitly models task uncertainty and includes an auxiliary model to enable robust adaptation to new tasks. Extensive experiments show that our method achieves strong performance and outperforms multiple meta-learning BO methods across various benchmarks.}
}
Endnote
%0 Conference Paper
%T MALIBO: Meta-learning for Likelihood-free Bayesian Optimization
%A Jiarong Pan
%A Stefan Falkner
%A Felix Berkenkamp
%A Joaquin Vanschoren
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-pan24b
%I PMLR
%P 39102--39134
%U https://proceedings.mlr.press/v235/pan24b.html
%V 235
%X Bayesian optimization (BO) is a popular method to optimize costly black-box functions, and meta-learning has emerged as a way to leverage knowledge from related tasks to optimize new tasks faster. However, existing meta-learning methods for BO rely on surrogate models that are not scalable or are sensitive to varying input scales and noise types across tasks. Moreover, they often overlook the uncertainty associated with task similarity, leading to unreliable task adaptation when a new task differs significantly or has not been sufficiently explored yet. We propose a novel meta-learning BO approach that bypasses the surrogate model and directly learns the utility of queries across tasks. It explicitly models task uncertainty and includes an auxiliary model to enable robust adaptation to new tasks. Extensive experiments show that our method achieves strong performance and outperforms multiple meta-learning BO methods across various benchmarks.
APA
Pan, J., Falkner, S., Berkenkamp, F. & Vanschoren, J. (2024). MALIBO: Meta-learning for Likelihood-free Bayesian Optimization. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:39102-39134. Available from https://proceedings.mlr.press/v235/pan24b.html.