{PF}$^2$ES: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization

Jixiang Qing, Henry B. Moss, Tom Dhaene, Ivo Couckuyt
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:2565-2588, 2023.

Abstract

We present Parallel Feasible Pareto Frontier Entropy Search ($\{\mathrm{PF}\}^2$ES) — a novel information-theoretic acquisition function for multi-objective Bayesian optimization supporting unknown constraints and batch queries. Due to the complexity of characterizing the mutual information between candidate evaluations and (feasible) Pareto frontiers, existing approaches must either employ crude approximations that significantly hamper their performance or rely on expensive inference schemes that substantially increase the optimization’s computational overhead. By instead using a variational lower bound, $\{\mathrm{PF}\}^2$ES provides a low-cost and accurate estimate of the mutual information. We benchmark $\{\mathrm{PF}\}^2$ES against other information-theoretic acquisition functions, demonstrating its competitive performance for optimization across synthetic and real-world design problems.

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-qing23a,
  title = {\{PF\}$^2$ES: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization},
  author = {Qing, Jixiang and Moss, Henry B. and Dhaene, Tom and Couckuyt, Ivo},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages = {2565--2588},
  year = {2023},
  editor = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume = {206},
  series = {Proceedings of Machine Learning Research},
  month = {25--27 Apr},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v206/qing23a/qing23a.pdf},
  url = {https://proceedings.mlr.press/v206/qing23a.html},
  abstract = {We present Parallel Feasible Pareto Frontier Entropy Search ($\{\mathrm{PF}\}^2$ES) — a novel information-theoretic acquisition function for multi-objective Bayesian optimization supporting unknown constraints and batch queries. Due to the complexity of characterizing the mutual information between candidate evaluations and (feasible) Pareto frontiers, existing approaches must either employ crude approximations that significantly hamper their performance or rely on expensive inference schemes that substantially increase the optimization’s computational overhead. By instead using a variational lower bound, $\{\mathrm{PF}\}^2$ES provides a low-cost and accurate estimate of the mutual information. We benchmark $\{\mathrm{PF}\}^2$ES against other information-theoretic acquisition functions, demonstrating its competitive performance for optimization across synthetic and real-world design problems.}
}
Endnote
%0 Conference Paper
%T {PF}$^2$ES: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization
%A Jixiang Qing
%A Henry B. Moss
%A Tom Dhaene
%A Ivo Couckuyt
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-qing23a
%I PMLR
%P 2565--2588
%U https://proceedings.mlr.press/v206/qing23a.html
%V 206
%X We present Parallel Feasible Pareto Frontier Entropy Search ($\{\mathrm{PF}\}^2$ES) — a novel information-theoretic acquisition function for multi-objective Bayesian optimization supporting unknown constraints and batch queries. Due to the complexity of characterizing the mutual information between candidate evaluations and (feasible) Pareto frontiers, existing approaches must either employ crude approximations that significantly hamper their performance or rely on expensive inference schemes that substantially increase the optimization’s computational overhead. By instead using a variational lower bound, $\{\mathrm{PF}\}^2$ES provides a low-cost and accurate estimate of the mutual information. We benchmark $\{\mathrm{PF}\}^2$ES against other information-theoretic acquisition functions, demonstrating its competitive performance for optimization across synthetic and real-world design problems.
APA
Qing, J., Moss, H.B., Dhaene, T. & Couckuyt, I. (2023). {PF}$^2$ES: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:2565-2588. Available from https://proceedings.mlr.press/v206/qing23a.html.