Optimization, fast and slow: optimally switching between local and Bayesian optimization
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3443-3452, 2018.
Abstract
We develop the first Bayesian optimization algorithm, BLOSSOM, which selects between multiple alternative acquisition functions and traditional local optimization at each step. This is combined with a novel stopping condition based on expected regret. The pairing obtains the best characteristics of both local and Bayesian optimization: efficient use of function evaluations, superior convergence to the global minimum on a selection of optimization problems, and termination of the search once a principled and intuitive stopping condition has been fulfilled.
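The abstract only outlines the approach, so the following is a rough, hypothetical sketch of the general idea rather than the paper's BLOSSOM algorithm: at each step, candidates are proposed by several acquisition functions and by a local optimizer started from the incumbent, one candidate is selected, and the loop stops once an expected-regret estimate falls below a threshold. The toy objective, the EI/LCB acquisition functions, the selection rule, the scikit-learn GP surrogate, the SciPy local optimizer, and the expected-regret proxy are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: NOT the BLOSSOM algorithm, just the switching idea.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Toy 1-D objective standing in for an expensive black-box function.
    return np.sin(3.0 * x[0]) + 0.1 * x[0] ** 2

bounds = np.array([[-3.0, 3.0]])
rng = np.random.default_rng(0)

# Small initial design.
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(4, 1))
y = np.array([objective(x) for x in X])

def expected_improvement(mu, sigma, best):
    z = (best - mu) / np.maximum(sigma, 1e-12)
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def lower_confidence_bound(mu, sigma, beta=2.0):
    # Negated so that "larger is better", matching EI's sign convention.
    return -(mu - beta * sigma)

for step in range(30):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    best_idx = int(np.argmin(y))
    best_y = y[best_idx]

    # Dense candidate grid (fine in 1-D; higher dimensions would need a
    # proper inner optimizer for each acquisition function).
    grid = np.linspace(bounds[0, 0], bounds[0, 1], 512).reshape(-1, 1)
    mu, sigma = gp.predict(grid, return_std=True)

    # One candidate per global acquisition function.
    candidates = {
        "EI": grid[np.argmax(expected_improvement(mu, sigma, best_y))],
        "LCB": grid[np.argmax(lower_confidence_bound(mu, sigma))],
    }
    # One candidate from a local optimizer on the GP mean, started at the incumbent.
    local = minimize(lambda x: gp.predict(x.reshape(1, -1))[0],
                     X[best_idx], bounds=bounds)
    candidates["local"] = local.x

    # Illustrative selection rule: keep the candidate with the largest one-step
    # expected improvement under the GP (a stand-in for the paper's criterion).
    def one_step_ei(x):
        m, s = gp.predict(x.reshape(1, -1), return_std=True)
        return expected_improvement(m[0], s[0], best_y)

    name, x_next = max(candidates.items(), key=lambda kv: one_step_ei(kv[1]))

    # Crude expected-regret proxy: the best achievable EI anywhere on the grid.
    expected_regret = expected_improvement(mu, sigma, best_y).max()
    if expected_regret < 1e-4:
        print(f"stopping at step {step}: expected-regret proxy {expected_regret:.2e}")
        break

    X = np.vstack([X, x_next.reshape(1, -1)])
    y = np.append(y, objective(x_next))
    print(f"step {step}: chose {name}, f(x)={y[-1]:.4f}, best so far={y.min():.4f}")
```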