Multi-fidelity Bayesian Optimization with Multiple Information Sources of Input-dependent Fidelity

Mingzhou Fan, Byung-Jun Yoon, Edward Dougherty, Nathan Urban, Francis Alexander, Raymundo Arróyave, Xiaoning Qian
Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence, PMLR 244:1271-1293, 2024.

Abstract

By querying approximate surrogate models of different fidelity as available information sources, Multi-Fidelity Bayesian Optimization (MFBO) aims at optimizing unknown functions that are costly, if not infeasible, to evaluate. Existing MFBO methods often assume that approximate surrogates have consistently high or low fidelity across the entire input domain. However, approximate evaluations from the same surrogate can have different fidelity in different input regions due to data availability and model constraints, especially when considering machine learning surrogates. In this work, we investigate MFBO when multi-fidelity approximations have input-dependent fidelity. By explicitly capturing input dependency for multi-fidelity queries in a Gaussian Process (GP) model, our new input-dependent MFBO (iMFBO) with learnable noise models better captures the fidelity of each information source in an intuitive way. We further design a new acquisition function for iMFBO, prove that the queries it selects are of higher quality than those selected by naive MFBO methods, and derive a sub-linear regret bound. Experiments on both synthetic and real-world data demonstrate its superior empirical performance.
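
To make the modeling idea concrete, below is a minimal numpy sketch, not the paper's implementation: each information source m returns f(x) plus Gaussian noise whose variance sigma_m(x)^2 depends on the input, and the GP posterior conditions on observations from all sources jointly through a heteroscedastic noise term. The two noise functions, the two-source setup, and the cost-weighted UCB rule at the end are illustrative stand-ins; the paper learns the noise models from data and derives its own acquisition function.

import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.3, variance=1.0):
    # Squared-exponential kernel on 1-D inputs.
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

# Hypothetical input-dependent noise std for two sources: source 0 is
# reliable near x = 0.2, source 1 near x = 0.8. In the paper these noise
# models are learnable; here they are fixed for illustration.
noise_std = [
    lambda x: 0.05 + 0.5 * np.abs(x - 0.2),
    lambda x: 0.05 + 0.5 * np.abs(x - 0.8),
]

def true_f(x):
    return np.sin(6.0 * x)

rng = np.random.default_rng(0)
X_obs = rng.uniform(0.0, 1.0, size=10)   # past query locations
src = rng.integers(0, 2, size=10)        # which source answered each query
y_obs = true_f(X_obs) + np.array(
    [noise_std[m](x) for x, m in zip(X_obs, src)]
) * rng.standard_normal(10)

def gp_posterior(X_test):
    # GP posterior in which each observation carries its own noise variance
    # sigma_m(x)^2, i.e. a heteroscedastic, source-dependent noise model.
    noise_var = np.array([noise_std[m](x) ** 2 for x, m in zip(X_obs, src)])
    K_noisy = rbf_kernel(X_obs, X_obs) + np.diag(noise_var)
    K_s = rbf_kernel(X_test, X_obs)
    mean = K_s @ np.linalg.solve(K_noisy, y_obs)
    cov = rbf_kernel(X_test, X_test) - K_s @ np.linalg.solve(K_noisy, K_s.T)
    return mean, np.maximum(np.diag(cov), 1e-12)

# Pick the (input, source) pair with high UCB value, penalized by the query
# cost scaled by that source's predicted noise at x -- a generic stand-in
# for the acquisition function derived in the paper.
X_grid = np.linspace(0.0, 1.0, 201)
mean, var = gp_posterior(X_grid)
beta, cost = 2.0, np.array([1.0, 1.0])   # exploration weight, query costs
ucb = mean + beta * np.sqrt(var)
scores = np.stack([ucb - cost[m] * noise_std[m](X_grid) ** 2 for m in (0, 1)])
m_star, i_star = np.unravel_index(np.argmax(scores), scores.shape)
print(f"next query: x = {X_grid[i_star]:.3f} from source {m_star}")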

Cite this Paper

BibTeX
@InProceedings{pmlr-v244-fan24a,
  title     = {Multi-fidelity Bayesian Optimization with Multiple Information Sources of Input-dependent Fidelity},
  author    = {Fan, Mingzhou and Yoon, Byung-Jun and Dougherty, Edward and Urban, Nathan and Alexander, Francis and Arr\'oyave, Raymundo and Qian, Xiaoning},
  booktitle = {Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence},
  pages     = {1271--1293},
  year      = {2024},
  editor    = {Kiyavash, Negar and Mooij, Joris M.},
  volume    = {244},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v244/main/assets/fan24a/fan24a.pdf},
  url       = {https://proceedings.mlr.press/v244/fan24a.html},
  abstract  = {By querying approximate surrogate models of different fidelity as available information sources, Multi-Fidelity Bayesian Optimization (MFBO) aims at optimizing unknown functions that are costly, if not infeasible, to evaluate. Existing MFBO methods often assume that approximate surrogates have consistently high or low fidelity across the entire input domain. However, approximate evaluations from the same surrogate can have different fidelity in different input regions due to data availability and model constraints, especially when considering machine learning surrogates. In this work, we investigate MFBO when multi-fidelity approximations have input-dependent fidelity. By explicitly capturing input dependency for multi-fidelity queries in a Gaussian Process (GP) model, our new input-dependent MFBO (iMFBO) with learnable noise models better captures the fidelity of each information source in an intuitive way. We further design a new acquisition function for iMFBO, prove that the queries it selects are of higher quality than those selected by naive MFBO methods, and derive a sub-linear regret bound. Experiments on both synthetic and real-world data demonstrate its superior empirical performance.}
}
Endnote
%0 Conference Paper
%T Multi-fidelity Bayesian Optimization with Multiple Information Sources of Input-dependent Fidelity
%A Mingzhou Fan
%A Byung-Jun Yoon
%A Edward Dougherty
%A Nathan Urban
%A Francis Alexander
%A Raymundo Arróyave
%A Xiaoning Qian
%B Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2024
%E Negar Kiyavash
%E Joris M. Mooij
%F pmlr-v244-fan24a
%I PMLR
%P 1271--1293
%U https://proceedings.mlr.press/v244/fan24a.html
%V 244
%X By querying approximate surrogate models of different fidelity as available information sources, Multi-Fidelity Bayesian Optimization (MFBO) aims at optimizing unknown functions that are costly, if not infeasible, to evaluate. Existing MFBO methods often assume that approximate surrogates have consistently high or low fidelity across the entire input domain. However, approximate evaluations from the same surrogate can have different fidelity in different input regions due to data availability and model constraints, especially when considering machine learning surrogates. In this work, we investigate MFBO when multi-fidelity approximations have input-dependent fidelity. By explicitly capturing input dependency for multi-fidelity queries in a Gaussian Process (GP) model, our new input-dependent MFBO (iMFBO) with learnable noise models better captures the fidelity of each information source in an intuitive way. We further design a new acquisition function for iMFBO, prove that the queries it selects are of higher quality than those selected by naive MFBO methods, and derive a sub-linear regret bound. Experiments on both synthetic and real-world data demonstrate its superior empirical performance.
APA
Fan, M., Yoon, B., Dougherty, E., Urban, N., Alexander, F., Arróyave, R. & Qian, X. (2024). Multi-fidelity Bayesian Optimization with Multiple Information Sources of Input-dependent Fidelity. Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 244:1271-1293. Available from https://proceedings.mlr.press/v244/fan24a.html.