Optimization, fast and slow: optimally switching between local and Bayesian optimization

Mark McLeod, Stephen Roberts, Michael A. Osborne
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3443-3452, 2018.

Abstract

We develop the first Bayesian Optimization algorithm, BLOSSOM, which selects between multiple alternative acquisition functions and traditional local optimization at each step. This is combined with a novel stopping condition based on expected regret. This pairing allows us to obtain the best characteristics of both local and Bayesian optimization, making efficient use of function evaluations while yielding superior convergence to the global minimum on a selection of optimization problems, and also halting optimization once a principled and intuitive stopping condition has been fulfilled.
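The switching-and-stopping pattern described in the abstract can be illustrated with a minimal sketch. This is not the authors' BLOSSOM implementation: the function name `optimize_switching`, the alternating switch rule, and the incumbent-improvement proxy for expected regret are all simplifying assumptions made purely for illustration.

```python
import random

def optimize_switching(f, bounds, n_max=60, regret_tol=1e-3, seed=0):
    """Toy illustration of the fast/slow pattern: at each step choose
    between a global exploratory move and a local refinement move, and
    stop once a crude expected-regret estimate falls below regret_tol.
    Not BLOSSOM -- a stand-in sketch of the switching/stopping idea."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [lo + (hi - lo) * rng.random() for _ in range(3)]  # initial design
    ys = [f(x) for x in xs]
    for _ in range(n_max):
        best_i = min(range(len(ys)), key=ys.__getitem__)
        x_best = xs[best_i]
        # Crude expected-regret proxy: gap between the two best values.
        # (BLOSSOM derives this from the surrogate model instead.)
        two_best = sorted(ys)[:2]
        regret_est = two_best[1] - two_best[0]
        if len(ys) > 5 and regret_est < regret_tol:
            break  # principled-stop stand-in: little further gain expected
        if len(ys) % 2 == 0:
            # "Bayesian" stand-in: explore globally
            x_new = lo + (hi - lo) * rng.random()
        else:
            # local-optimization stand-in: refine near the incumbent
            step = 0.05 * (hi - lo)
            x_new = min(hi, max(lo, x_best + rng.uniform(-step, step)))
        xs.append(x_new)
        ys.append(f(x_new))
    best_i = min(range(len(ys)), key=ys.__getitem__)
    return xs[best_i], ys[best_i]

# Example: minimize a quadratic on [0, 1]
x_star, y_star = optimize_switching(lambda x: (x - 0.3) ** 2, (0.0, 1.0))
```

In the paper the choice between acquisition functions and local optimization is itself made optimally at every step, rather than by the fixed alternation used above, and the stopping condition is based on a model-derived expected regret.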

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-mcleod18a,
  title     = {Optimization, fast and slow: optimally switching between local and {B}ayesian optimization},
  author    = {McLeod, Mark and Roberts, Stephen and Osborne, Michael A.},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3443--3452},
  year      = {2018},
  editor    = {Jennifer Dy and Andreas Krause},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  address   = {Stockholmsmässan, Stockholm Sweden},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/mcleod18a/mcleod18a.pdf},
  url       = {http://proceedings.mlr.press/v80/mcleod18a.html},
  abstract  = {We develop the first Bayesian Optimization algorithm, BLOSSOM, which selects between multiple alternative acquisition functions and traditional local optimization at each step. This is combined with a novel stopping condition based on expected regret. This pairing allows us to obtain the best characteristics of both local and Bayesian optimization, making efficient use of function evaluations while yielding superior convergence to the global minimum on a selection of optimization problems, and also halting optimization once a principled and intuitive stopping condition has been fulfilled.}
}
Endnote
%0 Conference Paper
%T Optimization, fast and slow: optimally switching between local and Bayesian optimization
%A Mark McLeod
%A Stephen Roberts
%A Michael A. Osborne
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-mcleod18a
%I PMLR
%J Proceedings of Machine Learning Research
%P 3443--3452
%U http://proceedings.mlr.press
%V 80
%W PMLR
%X We develop the first Bayesian Optimization algorithm, BLOSSOM, which selects between multiple alternative acquisition functions and traditional local optimization at each step. This is combined with a novel stopping condition based on expected regret. This pairing allows us to obtain the best characteristics of both local and Bayesian optimization, making efficient use of function evaluations while yielding superior convergence to the global minimum on a selection of optimization problems, and also halting optimization once a principled and intuitive stopping condition has been fulfilled.
APA
McLeod, M., Roberts, S. & Osborne, M. A. (2018). Optimization, fast and slow: optimally switching between local and Bayesian optimization. Proceedings of the 35th International Conference on Machine Learning, in PMLR 80:3443-3452.
