Fast Information-theoretic Bayesian Optimisation

Binxin Ru, Michael A. Osborne, Mark Mcleod, Diego Granziol
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4384-4392, 2018.

Abstract

Information-theoretic Bayesian optimisation techniques have demonstrated state-of-the-art performance in tackling important global optimisation problems. However, current information-theoretic approaches require many approximations in implementation, introduce often-prohibitive computational overhead and limit the choice of kernels available to model the objective. We develop a fast information-theoretic Bayesian Optimisation method, FITBO, that avoids the need for sampling the global minimiser, thus significantly reducing computational overhead. Moreover, in comparison with existing approaches, our method faces fewer constraints on kernel choice and enjoys the merits of dealing with the output space. We demonstrate empirically that FITBO inherits the performance associated with information-theoretic Bayesian optimisation, while being even faster than simpler Bayesian optimisation approaches, such as Expected Improvement.
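For background on the baseline the abstract compares against, here is a minimal numpy-only sketch of a Bayesian optimisation loop using a Gaussian-process surrogate and the Expected Improvement acquisition. This is an illustrative sketch, not the FITBO method or the authors' code; all function names (`rbf_kernel`, `gp_posterior`, `expected_improvement`) and the toy objective are assumptions for the example.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf_kernel(A, B, ls=0.5):
    # Squared-exponential kernel; ls is an assumed fixed lengthscale.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Standard GP regression posterior mean and stddev at test points Xs.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(np.diag(rbf_kernel(Xs, Xs)) - (v * v).sum(0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, y_best):
    # EI for minimisation: E[max(y_best - f(x), 0)] under the GP posterior.
    z = (y_best - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))  # normal CDF
    phi = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)          # normal PDF
    return (y_best - mu) * Phi + sigma * phi

# Toy 1-D objective (illustrative only).
f = lambda x: np.sin(3.0 * x) + 0.5 * x

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=(4, 1))   # initial design
y = f(X).ravel()
grid = np.linspace(0.0, 2.0, 200)[:, None]

for _ in range(5):
    mu, sigma = gp_posterior(X, y, grid)
    ei = expected_improvement(mu, sigma, y.min())
    x_next = grid[np.argmax(ei)]         # greedy EI maximisation on a grid
    X = np.vstack([X, x_next[None, :]])
    y = np.append(y, f(x_next)[0])
```

Note that each EI step above needs only the GP posterior at candidate points; information-theoretic acquisitions additionally reason about the entropy of the minimiser or minimum, which is where the paper's computational savings apply.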

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-ru18a,
  title     = {Fast Information-theoretic {B}ayesian Optimisation},
  author    = {Ru, Binxin and Osborne, Michael A. and Mcleod, Mark and Granziol, Diego},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {4384--4392},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/ru18a/ru18a.pdf},
  url       = {http://proceedings.mlr.press/v80/ru18a.html},
  abstract  = {Information-theoretic Bayesian optimisation techniques have demonstrated state-of-the-art performance in tackling important global optimisation problems. However, current information-theoretic approaches require many approximations in implementation, introduce often-prohibitive computational overhead and limit the choice of kernels available to model the objective. We develop a fast information-theoretic Bayesian Optimisation method, FITBO, that avoids the need for sampling the global minimiser, thus significantly reducing computational overhead. Moreover, in comparison with existing approaches, our method faces fewer constraints on kernel choice and enjoys the merits of dealing with the output space. We demonstrate empirically that FITBO inherits the performance associated with information-theoretic Bayesian optimisation, while being even faster than simpler Bayesian optimisation approaches, such as Expected Improvement.}
}
Endnote
%0 Conference Paper
%T Fast Information-theoretic Bayesian Optimisation
%A Binxin Ru
%A Michael A. Osborne
%A Mark Mcleod
%A Diego Granziol
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-ru18a
%I PMLR
%P 4384--4392
%U http://proceedings.mlr.press/v80/ru18a.html
%V 80
%X Information-theoretic Bayesian optimisation techniques have demonstrated state-of-the-art performance in tackling important global optimisation problems. However, current information-theoretic approaches require many approximations in implementation, introduce often-prohibitive computational overhead and limit the choice of kernels available to model the objective. We develop a fast information-theoretic Bayesian Optimisation method, FITBO, that avoids the need for sampling the global minimiser, thus significantly reducing computational overhead. Moreover, in comparison with existing approaches, our method faces fewer constraints on kernel choice and enjoys the merits of dealing with the output space. We demonstrate empirically that FITBO inherits the performance associated with information-theoretic Bayesian optimisation, while being even faster than simpler Bayesian optimisation approaches, such as Expected Improvement.
APA
Ru, B., Osborne, M.A., Mcleod, M. & Granziol, D. (2018). Fast Information-theoretic Bayesian Optimisation. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:4384-4392. Available from http://proceedings.mlr.press/v80/ru18a.html.