High-Dimensional Bayesian Optimization via Additive Models with Overlapping Groups

Paul Rolland, Jonathan Scarlett, Ilija Bogunovic, Volkan Cevher
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:298-307, 2018.

Abstract

Bayesian optimization (BO) is a popular technique for sequential black-box function optimization, with applications including parameter tuning, robotics, environmental monitoring, and more. One of the most important challenges in BO is the development of algorithms that scale to high dimensions, which remains a key open problem despite recent progress. In this paper, we consider the approach of Kandasamy et al. (2015), in which the high-dimensional function decomposes as a sum of lower-dimensional functions on subsets of the underlying variables. In particular, we significantly generalize this approach by lifting the assumption that the subsets are disjoint, and consider additive models with arbitrary overlap among the subsets. By representing the dependencies via a graph, we deduce an efficient message passing algorithm for optimizing the acquisition function. In addition, we provide an algorithm for learning the graph from samples based on Gibbs sampling. We empirically demonstrate the effectiveness of our methods on both synthetic and real-world data.
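The additive model described in the abstract writes the high-dimensional function as a sum of low-dimensional components, f(x) = Σᵢ fᵢ(x_{Aᵢ}), where the variable subsets Aᵢ may overlap. As a minimal, hypothetical sketch (not the paper's implementation), the corresponding GP prior uses a kernel that is a sum of per-group kernels, each acting only on its group's coordinates:

```python
import numpy as np

def rbf(X, Y, lengthscale=1.0):
    """Squared-exponential kernel between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def additive_kernel(X, Y, groups):
    """Sum of RBF kernels, one per (possibly overlapping) variable group.

    A sum of valid kernels is itself a valid kernel, so this defines a
    GP prior whose samples decompose additively over the groups.
    """
    return sum(rbf(X[:, g], Y[:, g]) for g in groups)

# Overlapping groups on a 4-dimensional input: variables 1 and 2 are shared,
# which the disjoint model of Kandasamy et al. (2015) would not allow.
groups = [[0, 1], [1, 2], [2, 3]]

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4))
K = additive_kernel(X, X, groups)

print(K.shape)                # (5, 5)
print(np.allclose(K, K.T))    # True: the summed kernel is symmetric
```

Because each group's kernel depends only on a few coordinates, the posterior of each component fᵢ is low-dimensional; the paper exploits the resulting dependency graph to optimize the acquisition function by message passing rather than a search over the full input space.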

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-rolland18a,
  title     = {High-Dimensional Bayesian Optimization via Additive Models with Overlapping Groups},
  author    = {Rolland, Paul and Scarlett, Jonathan and Bogunovic, Ilija and Cevher, Volkan},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {298--307},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/rolland18a/rolland18a.pdf},
  url       = {https://proceedings.mlr.press/v84/rolland18a.html},
  abstract  = {Bayesian optimization (BO) is a popular technique for sequential black-box function optimization, with applications including parameter tuning, robotics, environmental monitoring, and more. One of the most important challenges in BO is the development of algorithms that scale to high dimensions, which remains a key open problem despite recent progress. In this paper, we consider the approach of Kandasamy et al. (2015), in which the high-dimensional function decomposes as a sum of lower-dimensional functions on subsets of the underlying variables. In particular, we significantly generalize this approach by lifting the assumption that the subsets are disjoint, and consider additive models with arbitrary overlap among the subsets. By representing the dependencies via a graph, we deduce an efficient message passing algorithm for optimizing the acquisition function. In addition, we provide an algorithm for learning the graph from samples based on Gibbs sampling. We empirically demonstrate the effectiveness of our methods on both synthetic and real-world data.}
}
Endnote
%0 Conference Paper
%T High-Dimensional Bayesian Optimization via Additive Models with Overlapping Groups
%A Paul Rolland
%A Jonathan Scarlett
%A Ilija Bogunovic
%A Volkan Cevher
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-rolland18a
%I PMLR
%P 298--307
%U https://proceedings.mlr.press/v84/rolland18a.html
%V 84
%X Bayesian optimization (BO) is a popular technique for sequential black-box function optimization, with applications including parameter tuning, robotics, environmental monitoring, and more. One of the most important challenges in BO is the development of algorithms that scale to high dimensions, which remains a key open problem despite recent progress. In this paper, we consider the approach of Kandasamy et al. (2015), in which the high-dimensional function decomposes as a sum of lower-dimensional functions on subsets of the underlying variables. In particular, we significantly generalize this approach by lifting the assumption that the subsets are disjoint, and consider additive models with arbitrary overlap among the subsets. By representing the dependencies via a graph, we deduce an efficient message passing algorithm for optimizing the acquisition function. In addition, we provide an algorithm for learning the graph from samples based on Gibbs sampling. We empirically demonstrate the effectiveness of our methods on both synthetic and real-world data.
APA
Rolland, P., Scarlett, J., Bogunovic, I. & Cevher, V. (2018). High-Dimensional Bayesian Optimization via Additive Models with Overlapping Groups. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:298-307. Available from https://proceedings.mlr.press/v84/rolland18a.html.