Robust Multi-fidelity Bayesian Optimization with Deep Kernel and Partition

Fengxue Zhang, Thomas Desautels, Yuxin Chen
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:2683-2691, 2025.

Abstract

Multi-fidelity Bayesian optimization (MFBO) is a powerful approach that utilizes low-fidelity, cost-effective sources to expedite the exploration and exploitation of a high-fidelity objective function. Existing MFBO methods with theoretical foundations either lack justification for performance improvements over single-fidelity optimization or rely on strong assumptions about the relationships between fidelity sources to construct surrogate models and direct queries to low-fidelity sources. To mitigate the dependency on cross-fidelity assumptions while maintaining the advantages of low-fidelity queries, we introduce a random sampling and partition-based MFBO framework with deep kernel learning. This framework is robust to cross-fidelity model misspecification and explicitly illustrates the benefits of low-fidelity queries. Our results demonstrate that the proposed algorithm effectively manages complex cross-fidelity relationships and efficiently optimizes the target fidelity function.
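For readers unfamiliar with deep kernel learning, one component of the surrogate described in the abstract, the sketch below is a minimal, illustrative example of a deep-kernel Gaussian process written with GPyTorch. It is not the authors' implementation; the network architecture, class names, and layer sizes are assumptions made purely for illustration.

    # Illustrative deep-kernel GP sketch (assumes PyTorch and GPyTorch are installed).
    # Architecture and hyperparameters are hypothetical, not taken from the paper.
    import torch
    import gpytorch

    class FeatureExtractor(torch.nn.Sequential):
        """Small MLP mapping raw inputs to a learned feature space."""
        def __init__(self, input_dim, feature_dim=2):
            super().__init__(
                torch.nn.Linear(input_dim, 32),
                torch.nn.ReLU(),
                torch.nn.Linear(32, feature_dim),
            )

    class DeepKernelGP(gpytorch.models.ExactGP):
        """GP whose kernel operates on NN-extracted features (deep kernel learning)."""
        def __init__(self, train_x, train_y, likelihood):
            super().__init__(train_x, train_y, likelihood)
            self.mean_module = gpytorch.means.ConstantMean()
            self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())
            self.feature_extractor = FeatureExtractor(train_x.shape[-1])

        def forward(self, x):
            z = self.feature_extractor(x)   # embed inputs before applying the kernel
            return gpytorch.distributions.MultivariateNormal(
                self.mean_module(z), self.covar_module(z)
            )

    # Example usage on toy data:
    train_x = torch.rand(20, 3)
    train_y = torch.sin(train_x.sum(dim=-1))
    likelihood = gpytorch.likelihoods.GaussianLikelihood()
    model = DeepKernelGP(train_x, train_y, likelihood)

In a Bayesian optimization loop, such a surrogate would be refit as new observations arrive and queried by an acquisition function; how the paper combines this with random sampling and partitioning across fidelities is described in the full text.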

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-zhang25g,
  title     = {Robust Multi-fidelity Bayesian Optimization with Deep Kernel and Partition},
  author    = {Zhang, Fengxue and Desautels, Thomas and Chen, Yuxin},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {2683--2691},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/zhang25g/zhang25g.pdf},
  url       = {https://proceedings.mlr.press/v258/zhang25g.html},
  abstract  = {Multi-fidelity Bayesian optimization (MFBO) is a powerful approach that utilizes low-fidelity, cost-effective sources to expedite the exploration and exploitation of a high-fidelity objective function. Existing MFBO methods with theoretical foundations either lack justification for performance improvements over single-fidelity optimization or rely on strong assumptions about the relationships between fidelity sources to construct surrogate models and direct queries to low-fidelity sources. To mitigate the dependency on cross-fidelity assumptions while maintaining the advantages of low-fidelity queries, we introduce a random sampling and partition-based MFBO framework with deep kernel learning. This framework is robust to cross-fidelity model misspecification and explicitly illustrates the benefits of low-fidelity queries. Our results demonstrate that the proposed algorithm effectively manages complex cross-fidelity relationships and efficiently optimizes the target fidelity function.}
}
Endnote
%0 Conference Paper
%T Robust Multi-fidelity Bayesian Optimization with Deep Kernel and Partition
%A Fengxue Zhang
%A Thomas Desautels
%A Yuxin Chen
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-zhang25g
%I PMLR
%P 2683--2691
%U https://proceedings.mlr.press/v258/zhang25g.html
%V 258
%X Multi-fidelity Bayesian optimization (MFBO) is a powerful approach that utilizes low-fidelity, cost-effective sources to expedite the exploration and exploitation of a high-fidelity objective function. Existing MFBO methods with theoretical foundations either lack justification for performance improvements over single-fidelity optimization or rely on strong assumptions about the relationships between fidelity sources to construct surrogate models and direct queries to low-fidelity sources. To mitigate the dependency on cross-fidelity assumptions while maintaining the advantages of low-fidelity queries, we introduce a random sampling and partition-based MFBO framework with deep kernel learning. This framework is robust to cross-fidelity model misspecification and explicitly illustrates the benefits of low-fidelity queries. Our results demonstrate that the proposed algorithm effectively manages complex cross-fidelity relationships and efficiently optimizes the target fidelity function.
APA
Zhang, F., Desautels, T. & Chen, Y. (2025). Robust Multi-fidelity Bayesian Optimization with Deep Kernel and Partition. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:2683-2691. Available from https://proceedings.mlr.press/v258/zhang25g.html.
