Optimization, fast and slow: optimally switching between local and Bayesian optimization

Mark McLeod, Stephen Roberts, Michael A. Osborne
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3443-3452, 2018.

Abstract

We develop the first Bayesian Optimization algorithm, BLOSSOM, which selects between multiple alternative acquisition functions and traditional local optimization at each step. This is combined with a novel stopping condition based on expected regret. This pairing allows us to obtain the best characteristics of both local and Bayesian optimization, making efficient use of function evaluations while yielding superior convergence to the global minimum on a selection of optimization problems, and also halting optimization once a principled and intuitive stopping condition has been fulfilled.
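The abstract describes the core idea only at a high level: at each step the algorithm decides whether to spend the next evaluation on a global, model-based (Bayesian) step or on a cheap local refinement, and it halts once the expected regret is small. The sketch below is an illustrative toy loop in that spirit only, not the BLOSSOM algorithm from the paper: the Gaussian-process model, the expected-improvement-based regret proxy, the switching threshold, and the toy objective are all simplifying assumptions introduced here.

```python
# Illustrative sketch only -- NOT the authors' BLOSSOM algorithm.
# A Bayesian-optimization loop that, at each step, chooses between a global
# model-based step and a traditional local-optimizer step, and stops once a
# crude expected-regret proxy falls below a tolerance.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def objective(x):
    """Toy 1-D objective with several local minima (stands in for the true f)."""
    x = np.asarray(x)
    return np.sin(3.0 * x) + 0.1 * x ** 2


def expected_improvement(X_cand, gp, y_best):
    """Standard EI acquisition, used here both to pick points and as a regret proxy."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)


bounds = (-3.0, 3.0)
regret_tol = 1e-3                        # stop when the regret proxy drops below this
rng = np.random.default_rng(0)

X = rng.uniform(*bounds, size=(4, 1))    # small initial design
y = objective(X).ravel()

for step in range(50):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6,
                                  normalize_y=True).fit(X, y)
    y_best = y.min()

    grid = np.linspace(*bounds, 200).reshape(-1, 1)
    ei = expected_improvement(grid, gp, y_best)

    # Stopping rule: if no candidate promises more than regret_tol improvement, halt.
    if ei.max() < regret_tol:
        print(f"step {step}: expected-regret proxy below {regret_tol}, stopping")
        break

    if ei.max() > 10 * regret_tol:
        # "Slow" move: global, model-based step at the EI maximiser.
        x_next = grid[np.argmax(ei)]
    else:
        # "Fast" move: hand over to a traditional local optimiser near the incumbent.
        x0 = X[np.argmin(y)]
        res = minimize(lambda z: objective(z).item(), x0, bounds=[bounds])
        x_next = np.atleast_1d(res.x)

    X = np.vstack([X, x_next.reshape(1, -1)])
    y = np.append(y, objective(x_next).item())

print("best x:", X[np.argmin(y)].item(), "best f(x):", y.min())
```

In the paper itself, the choice between alternative acquisition functions and local optimization is made by a principled selection rule, and the stopping condition is derived from the expected regret under the model; here both are replaced by a simple expected-improvement threshold purely to make the control flow concrete.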

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-mcleod18a,
  title     = {Optimization, fast and slow: optimally switching between local and {B}ayesian optimization},
  author    = {McLeod, Mark and Roberts, Stephen and Osborne, Michael A.},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3443--3452},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/mcleod18a/mcleod18a.pdf},
  url       = {https://proceedings.mlr.press/v80/mcleod18a.html},
  abstract  = {We develop the first Bayesian Optimization algorithm, BLOSSOM, which selects between multiple alternative acquisition functions and traditional local optimization at each step. This is combined with a novel stopping condition based on expected regret. This pairing allows us to obtain the best characteristics of both local and Bayesian optimization, making efficient use of function evaluations while yielding superior convergence to the global minimum on a selection of optimization problems, and also halting optimization once a principled and intuitive stopping condition has been fulfilled.}
}
Endnote
%0 Conference Paper
%T Optimization, fast and slow: optimally switching between local and Bayesian optimization
%A Mark McLeod
%A Stephen Roberts
%A Michael A. Osborne
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-mcleod18a
%I PMLR
%P 3443--3452
%U https://proceedings.mlr.press/v80/mcleod18a.html
%V 80
%X We develop the first Bayesian Optimization algorithm, BLOSSOM, which selects between multiple alternative acquisition functions and traditional local optimization at each step. This is combined with a novel stopping condition based on expected regret. This pairing allows us to obtain the best characteristics of both local and Bayesian optimization, making efficient use of function evaluations while yielding superior convergence to the global minimum on a selection of optimization problems, and also halting optimization once a principled and intuitive stopping condition has been fulfilled.
APA
McLeod, M., Roberts, S. & Osborne, M.A. (2018). Optimization, fast and slow: optimally switching between local and Bayesian optimization. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3443-3452. Available from https://proceedings.mlr.press/v80/mcleod18a.html.