Bayesian Optimization of Function Networks with Partial Evaluations

Poompol Buathong, Jiayue Wan, Raul Astudillo, Sam Daulton, Maximilian Balandat, Peter I. Frazier
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:4752-4784, 2024.

Abstract

Bayesian optimization is a powerful framework for optimizing functions that are expensive or time-consuming to evaluate. Recent work has considered Bayesian optimization of function networks (BOFN), where the objective function is given by a network of functions, each taking as input the output of previous nodes in the network as well as additional parameters. Leveraging this network structure has been shown to yield significant performance improvements. Existing BOFN algorithms for general-purpose networks evaluate the full network at each iteration. However, many real-world applications allow for evaluating nodes individually. To exploit this, we propose a novel knowledge gradient acquisition function that chooses which node and corresponding inputs to evaluate in a cost-aware manner, thereby reducing query costs by evaluating only on a part of the network at each step. We provide an efficient approach to optimizing our acquisition function and show that it outperforms existing BOFN methods and other benchmarks across several synthetic and real-world problems. Our acquisition function is the first to enable cost-aware optimization of a broad class of function networks.
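The core idea of the abstract, choosing which individual node of a function network to evaluate by trading predicted information gain against per-node query cost, can be illustrated with a toy sketch. This is not the paper's acquisition function: the variance-over-candidates score below is a crude stand-in for a knowledge-gradient computation, and the two-node chain, node functions, and costs are all hypothetical.

```python
import numpy as np

# Toy two-node chain: the overall objective is f2(f1(x)).
def f1(x):
    return np.sin(3.0 * x)

def f2(y1):
    return y1 - 0.25 * y1**2

# Hypothetical per-node evaluation costs: the outer node is expensive.
costs = {"f1": 1.0, "f2": 5.0}

def cost_aware_choice(candidates):
    """Pick the node whose output spread per unit cost is largest.

    A real method would score each (node, input) pair by the expected
    improvement in the posterior optimum (a knowledge gradient); here we
    use the sample variance over candidate inputs as a cheap proxy.
    """
    y1 = f1(candidates)
    scores = {
        "f1": np.var(y1) / costs["f1"],
        "f2": np.var(f2(y1)) / costs["f2"],
    }
    return max(scores, key=scores.get)

candidates = np.linspace(-1.0, 1.0, 11)
print(cost_aware_choice(candidates))  # the cheap inner node wins here
```

Because both nodes have similar output spread but the outer node costs five times more, the cost-aware score favors evaluating the inner node first, which is the kind of partial evaluation the paper exploits.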

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-buathong24a,
  title     = {{B}ayesian Optimization of Function Networks with Partial Evaluations},
  author    = {Buathong, Poompol and Wan, Jiayue and Astudillo, Raul and Daulton, Sam and Balandat, Maximilian and Frazier, Peter I.},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {4752--4784},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/buathong24a/buathong24a.pdf},
  url       = {https://proceedings.mlr.press/v235/buathong24a.html},
  abstract  = {Bayesian optimization is a powerful framework for optimizing functions that are expensive or time-consuming to evaluate. Recent work has considered Bayesian optimization of function networks (BOFN), where the objective function is given by a network of functions, each taking as input the output of previous nodes in the network as well as additional parameters. Leveraging this network structure has been shown to yield significant performance improvements. Existing BOFN algorithms for general-purpose networks evaluate the full network at each iteration. However, many real-world applications allow for evaluating nodes individually. To exploit this, we propose a novel knowledge gradient acquisition function that chooses which node and corresponding inputs to evaluate in a cost-aware manner, thereby reducing query costs by evaluating only on a part of the network at each step. We provide an efficient approach to optimizing our acquisition function and show that it outperforms existing BOFN methods and other benchmarks across several synthetic and real-world problems. Our acquisition function is the first to enable cost-aware optimization of a broad class of function networks.}
}
Endnote
%0 Conference Paper
%T Bayesian Optimization of Function Networks with Partial Evaluations
%A Poompol Buathong
%A Jiayue Wan
%A Raul Astudillo
%A Sam Daulton
%A Maximilian Balandat
%A Peter I. Frazier
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-buathong24a
%I PMLR
%P 4752--4784
%U https://proceedings.mlr.press/v235/buathong24a.html
%V 235
%X Bayesian optimization is a powerful framework for optimizing functions that are expensive or time-consuming to evaluate. Recent work has considered Bayesian optimization of function networks (BOFN), where the objective function is given by a network of functions, each taking as input the output of previous nodes in the network as well as additional parameters. Leveraging this network structure has been shown to yield significant performance improvements. Existing BOFN algorithms for general-purpose networks evaluate the full network at each iteration. However, many real-world applications allow for evaluating nodes individually. To exploit this, we propose a novel knowledge gradient acquisition function that chooses which node and corresponding inputs to evaluate in a cost-aware manner, thereby reducing query costs by evaluating only on a part of the network at each step. We provide an efficient approach to optimizing our acquisition function and show that it outperforms existing BOFN methods and other benchmarks across several synthetic and real-world problems. Our acquisition function is the first to enable cost-aware optimization of a broad class of function networks.
APA
Buathong, P., Wan, J., Astudillo, R., Daulton, S., Balandat, M., & Frazier, P. I. (2024). Bayesian Optimization of Function Networks with Partial Evaluations. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:4752-4784. Available from https://proceedings.mlr.press/v235/buathong24a.html.