Bayesian Optimization with Tree-structured Dependencies

Rodolphe Jenatton, Cedric Archambeau, Javier González, Matthias Seeger
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1655-1664, 2017.

Abstract

Bayesian optimization has been successfully used to optimize complex black-box functions whose evaluations are expensive. In many applications, like in deep learning and predictive analytics, the optimization domain is itself complex and structured. In this work, we focus on use cases where this domain exhibits a known dependency structure. The benefit of leveraging this structure is twofold: we explore the search space more efficiently and posterior inference scales more favorably with the number of observations than Gaussian Process-based approaches published in the literature. We introduce a novel surrogate model for Bayesian optimization which combines independent Gaussian Processes with a linear model that encodes a tree-based dependency structure and can transfer information between overlapping decision sequences. We also design a specialized two-step acquisition function that explores the search space more effectively. Our experiments on synthetic tree-structured functions and the tuning of feedforward neural networks trained on a range of binary classification datasets show that our method compares favorably with competing approaches.
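To make the notion of a tree-structured search domain concrete, the sketch below shows the kind of conditional hyperparameter space the paper targets (this is an illustrative assumption, not the authors' code): a categorical decision at an internal node, such as which optimizer to use, determines which hyperparameters further down the tree are active, so two configurations that share a path prefix share the parameters attached to that prefix. All names and value ranges are hypothetical.

# A minimal, illustrative sketch of a tree-structured search domain
# (hypothetical names/values; not the paper's implementation). Each
# internal node is a categorical decision; each branch activates its
# own continuous hyperparameters. Configurations sharing a path prefix
# share the parameters along that prefix, which is the structure the
# paper's surrogate model exploits to transfer information.
import random

SEARCH_TREE = {
    "optimizer": {                      # root decision
        "sgd": {
            "momentum": (0.0, 0.99),    # active only on the 'sgd' branch
            "learning_rate": (1e-4, 1e-1),
        },
        "adam": {
            "beta1": (0.8, 0.999),      # active only on the 'adam' branch
            "learning_rate": (1e-5, 1e-2),
        },
    },
}

def sample_configuration(tree, rng=random):
    """Walk the tree root-to-leaf: pick a branch at each categorical
    node, then sample the continuous parameters that branch activates."""
    config = {}
    for decision, branches in tree.items():
        choice = rng.choice(sorted(branches))
        config[decision] = choice
        for name, (low, high) in branches[choice].items():
            config[name] = rng.uniform(low, high)
    return config

if __name__ == "__main__":
    # e.g. {'optimizer': 'adam', 'beta1': 0.91, 'learning_rate': 0.004}
    print(sample_configuration(SEARCH_TREE))

Under this reading of the abstract, fitting an independent Gaussian Process per leaf (each conditioned only on the observations that reached that leaf) while a linear model ties the shared path parameters together is what lets posterior inference scale more favorably than a single GP over all observations.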

Cite this Paper
BibTeX
@InProceedings{pmlr-v70-jenatton17a,
  title     = {{B}ayesian Optimization with Tree-structured Dependencies},
  author    = {Rodolphe Jenatton and Cedric Archambeau and Javier Gonz{\'a}lez and Matthias Seeger},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {1655--1664},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/jenatton17a/jenatton17a.pdf},
  url       = {https://proceedings.mlr.press/v70/jenatton17a.html}
}
Endnote
%0 Conference Paper
%T Bayesian Optimization with Tree-structured Dependencies
%A Rodolphe Jenatton
%A Cedric Archambeau
%A Javier González
%A Matthias Seeger
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-jenatton17a
%I PMLR
%P 1655--1664
%U https://proceedings.mlr.press/v70/jenatton17a.html
%V 70
APA
Jenatton, R., Archambeau, C., González, J. & Seeger, M. (2017). Bayesian Optimization with Tree-structured Dependencies. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:1655-1664. Available from https://proceedings.mlr.press/v70/jenatton17a.html.
