Global optimization of Lipschitz functions

Cédric Malherbe, Nicolas Vayatis
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2314-2323, 2017.

Abstract

The goal of the paper is to design sequential strategies which lead to efficient optimization of an unknown function under the only assumption that it has a finite Lipschitz constant. We first identify sufficient conditions for the consistency of generic sequential algorithms and formulate the expected minimax rate for their performance. We introduce and analyze a first algorithm called LIPO, which assumes the Lipschitz constant to be known. Consistency and minimax rates for LIPO are proved, as well as fast rates under an additional Hölder-like condition. An adaptive version of LIPO is also introduced for the more realistic setup where the Lipschitz constant is unknown and has to be estimated along with the optimization. Similar theoretical guarantees are shown to hold for the adaptive LIPO algorithm, and a numerical assessment is provided at the end of the paper to illustrate the potential of this strategy with respect to state-of-the-art methods over typical benchmark problems for global optimization.
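The core decision rule of the known-constant LIPO algorithm is simple: draw a candidate uniformly at random, and evaluate the function there only if the upper bound implied by Lipschitz continuity exceeds the best value seen so far. A minimal sketch of that rule (the function names and interface here are illustrative, not from the paper):

```python
import numpy as np

def lipo_maximize(f, bounds, k, n_evals, rng=None):
    """Sketch of LIPO with a known Lipschitz constant k.

    Draws candidates uniformly in the box `bounds` and evaluates f only
    at points whose Lipschitz upper bound can exceed the current maximum.
    """
    rng = np.random.default_rng(rng)
    lo, hi = np.asarray(bounds, float).T  # bounds: list of (low, high) pairs
    # First point: uniform in the domain, always evaluated.
    X = [rng.uniform(lo, hi)]
    y = [f(X[0])]
    while len(y) < n_evals:
        x = rng.uniform(lo, hi)
        # Upper bound on f(x) implied by |f(x) - f(xi)| <= k * ||x - xi||.
        ub = min(yi + k * np.linalg.norm(x - xi) for xi, yi in zip(X, y))
        if ub >= max(y):  # candidate could still improve the maximum
            X.append(x)
            y.append(f(x))
    best = int(np.argmax(y))
    return X[best], y[best]
```

Rejected candidates cost no function evaluation, which is what lets LIPO concentrate its evaluation budget on regions that can still contain the maximizer; the adaptive variant replaces the fixed `k` with an estimate refined during the run.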

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-malherbe17a,
  title     = {Global optimization of {L}ipschitz functions},
  author    = {C{\'e}dric Malherbe and Nicolas Vayatis},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {2314--2323},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/malherbe17a/malherbe17a.pdf},
  url       = {https://proceedings.mlr.press/v70/malherbe17a.html},
  abstract  = {The goal of the paper is to design sequential strategies which lead to efficient optimization of an unknown function under the only assumption that it has a finite Lipschitz constant. We first identify sufficient conditions for the consistency of generic sequential algorithms and formulate the expected minimax rate for their performance. We introduce and analyze a first algorithm called LIPO which assumes the Lipschitz constant to be known. Consistency, minimax rates for LIPO are proved, as well as fast rates under an additional Hölder like condition. An adaptive version of LIPO is also introduced for the more realistic setup where Lipschitz constant is unknown and has to be estimated along with the optimization. Similar theoretical guarantees are shown to hold for the adaptive LIPO algorithm and a numerical assessment is provided at the end of the paper to illustrate the potential of this strategy with respect to state-of-the-art methods over typical benchmark problems for global optimization.}
}
Endnote
%0 Conference Paper
%T Global optimization of Lipschitz functions
%A Cédric Malherbe
%A Nicolas Vayatis
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-malherbe17a
%I PMLR
%P 2314--2323
%U https://proceedings.mlr.press/v70/malherbe17a.html
%V 70
%X The goal of the paper is to design sequential strategies which lead to efficient optimization of an unknown function under the only assumption that it has a finite Lipschitz constant. We first identify sufficient conditions for the consistency of generic sequential algorithms and formulate the expected minimax rate for their performance. We introduce and analyze a first algorithm called LIPO which assumes the Lipschitz constant to be known. Consistency, minimax rates for LIPO are proved, as well as fast rates under an additional Hölder like condition. An adaptive version of LIPO is also introduced for the more realistic setup where Lipschitz constant is unknown and has to be estimated along with the optimization. Similar theoretical guarantees are shown to hold for the adaptive LIPO algorithm and a numerical assessment is provided at the end of the paper to illustrate the potential of this strategy with respect to state-of-the-art methods over typical benchmark problems for global optimization.
APA
Malherbe, C. & Vayatis, N. (2017). Global optimization of Lipschitz functions. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:2314-2323. Available from https://www.proceedings.mlr.press/v70/malherbe17a.html.