Multi-fidelity Bayesian Optimization with Max-value Entropy Search and its Parallelization

Shion Takeno, Hitoshi Fukuoka, Yuhki Tsukada, Toshiyuki Koyama, Motoki Shiga, Ichiro Takeuchi, Masayuki Karasuyama
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:9334-9345, 2020.

Abstract

In a standard setting of Bayesian optimization (BO), the objective function evaluation is assumed to be highly expensive. Multi-fidelity Bayesian optimization (MFBO) accelerates BO by incorporating lower-fidelity observations that are available at a lower sampling cost. We propose a novel information-theoretic approach to MFBO, called multi-fidelity max-value entropy search (MF-MES), which enables a more reliable evaluation of the information gain than existing information-based methods for MFBO. Further, we propose a parallelization of MF-MES, mainly for the asynchronous setting, because queries in MFBO typically occur asynchronously due to the variety of sampling costs. We show that most of the computations in our acquisition functions can be derived analytically, except for at most a two-dimensional numerical integration that can be performed efficiently by simple approximations. We demonstrate the effectiveness of our approach on benchmark datasets and a real-world application to materials science data.
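For context, the information-gain criterion that MF-MES generalizes is the single-fidelity max-value entropy search (MES) acquisition, which scores a candidate point by the mutual information between its observation and sampled values of the objective's maximum f*. The sketch below shows that single-fidelity criterion under a Gaussian process posterior; it is not the authors' MF-MES implementation. The function name mes_acquisition, the array shapes, and the assumption that max-value samples f_star_samples are already available (e.g., via Gumbel sampling) are illustrative choices, and numerical safeguards are omitted.

import numpy as np
from scipy.stats import norm

def mes_acquisition(mu, sigma, f_star_samples):
    # mu, sigma: GP posterior mean/std at candidate points, shape (n_candidates,)
    # f_star_samples: Monte Carlo samples of the objective's maximum, shape (n_samples,)
    mu = np.asarray(mu, dtype=float)[:, None]
    sigma = np.asarray(sigma, dtype=float)[:, None]
    gamma = (np.asarray(f_star_samples, dtype=float)[None, :] - mu) / sigma
    cdf = norm.cdf(gamma)
    pdf = norm.pdf(gamma)
    # Closed-form entropy reduction of a Gaussian truncated at f*,
    # averaged over the sampled max values.
    return np.mean(gamma * pdf / (2.0 * cdf) - np.log(cdf), axis=1)

# Example: score three candidates against two sampled max values.
scores = mes_acquisition(mu=[0.2, 0.5, 0.9],
                         sigma=[1.0, 0.8, 0.3],
                         f_star_samples=[1.2, 1.5])

Roughly speaking, MF-MES keeps f* defined as the maximum of the highest-fidelity function, and evaluating the resulting conditional entropy for lower-fidelity queries (and for parallel queries) is where the at most two-dimensional numerical integration mentioned in the abstract enters.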

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-takeno20a,
  title     = {Multi-fidelity {B}ayesian Optimization with Max-value Entropy Search and its Parallelization},
  author    = {Takeno, Shion and Fukuoka, Hitoshi and Tsukada, Yuhki and Koyama, Toshiyuki and Shiga, Motoki and Takeuchi, Ichiro and Karasuyama, Masayuki},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {9334--9345},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/takeno20a/takeno20a.pdf},
  url       = {https://proceedings.mlr.press/v119/takeno20a.html},
  abstract  = {In a standard setting of Bayesian optimization (BO), the objective function evaluation is assumed to be highly expensive. Multi-fidelity Bayesian optimization (MFBO) accelerates BO by incorporating lower fidelity observations available with a lower sampling cost. We propose a novel information-theoretic approach to MFBO, called multi-fidelity max-value entropy search (MF-MES), that enables us to obtain a more reliable evaluation of the information gain compared with existing information-based methods for MFBO. Further, we also propose a parallelization of MF-MES mainly for the asynchronous setting because queries typically occur asynchronously in MFBO due to a variety of sampling costs. We show that most of computations in our acquisition functions can be derived analytically, except for at most only two dimensional numerical integration that can be performed efficiently by simple approximations. We demonstrate effectiveness of our approach by using benchmark datasets and a real-world application to materials science data.}
}
Endnote
%0 Conference Paper
%T Multi-fidelity Bayesian Optimization with Max-value Entropy Search and its Parallelization
%A Shion Takeno
%A Hitoshi Fukuoka
%A Yuhki Tsukada
%A Toshiyuki Koyama
%A Motoki Shiga
%A Ichiro Takeuchi
%A Masayuki Karasuyama
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-takeno20a
%I PMLR
%P 9334--9345
%U https://proceedings.mlr.press/v119/takeno20a.html
%V 119
%X In a standard setting of Bayesian optimization (BO), the objective function evaluation is assumed to be highly expensive. Multi-fidelity Bayesian optimization (MFBO) accelerates BO by incorporating lower fidelity observations available with a lower sampling cost. We propose a novel information-theoretic approach to MFBO, called multi-fidelity max-value entropy search (MF-MES), that enables us to obtain a more reliable evaluation of the information gain compared with existing information-based methods for MFBO. Further, we also propose a parallelization of MF-MES mainly for the asynchronous setting because queries typically occur asynchronously in MFBO due to a variety of sampling costs. We show that most of computations in our acquisition functions can be derived analytically, except for at most only two dimensional numerical integration that can be performed efficiently by simple approximations. We demonstrate effectiveness of our approach by using benchmark datasets and a real-world application to materials science data.
APA
Takeno, S., Fukuoka, H., Tsukada, Y., Koyama, T., Shiga, M., Takeuchi, I. & Karasuyama, M. (2020). Multi-fidelity Bayesian Optimization with Max-value Entropy Search and its Parallelization. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:9334-9345. Available from https://proceedings.mlr.press/v119/takeno20a.html.