A Framework for Bayesian Optimization in Embedded Subspaces

Amin Nayebi, Alexander Munteanu, Matthias Poloczek
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4752-4761, 2019.

Abstract

We present a theoretically founded approach for high-dimensional Bayesian optimization based on low-dimensional subspace embeddings. We prove that the error in the Gaussian process model is bounded tightly when going from the original high-dimensional search domain to the low-dimensional embedding. This implies that the optimization process in the low-dimensional embedding proceeds essentially as if it were run directly on an unknown active subspace of low dimensionality. The argument applies to a large class of algorithms and GP models, including non-stationary kernels. Moreover, we provide an efficient implementation based on hashing and demonstrate empirically that this subspace embedding achieves considerably better results than the previously proposed methods for high-dimensional BO based on Gaussian matrix projections and structure-learning.
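To illustrate the hashing-based subspace embedding the abstract refers to, the snippet below gives a minimal count-sketch-style sketch: each of the D ambient dimensions is hashed to one of d low-dimensional coordinates and assigned a random sign, and the black-box objective is evaluated at the embedded high-dimensional point. The function names, the toy objective, and the random-search loop (a stand-in for a GP model with an acquisition function) are illustrative assumptions, not the authors' implementation.

import numpy as np

def make_hashing_embedding(D, d, rng):
    """Return a map from a low-dimensional point y in [-1, 1]^d
    to a high-dimensional point x in [-1, 1]^D.

    Each high dimension i is hashed to one low dimension h[i] and
    given a random sign s[i]; then x[i] = s[i] * y[h[i]].
    """
    h = rng.integers(0, d, size=D)        # hash bucket for each high dimension
    s = rng.choice([-1.0, 1.0], size=D)   # random sign flips

    def embed(y):
        return s * y[h]

    return embed

def toy_objective(x):
    # Hypothetical black box whose value depends only on a few
    # "active" coordinates of the high-dimensional input.
    active = x[:3]
    return -np.sum((active - 0.5) ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D, d = 1000, 4                        # ambient and embedding dimensions
    embed = make_hashing_embedding(D, d, rng)

    # Stand-in for the BO loop: propose candidates in the d-dimensional box
    # and evaluate the black box at their high-dimensional embeddings.
    # (The actual method would fit a GP on (y, f(embed(y))) pairs and
    # maximize an acquisition function instead of sampling at random.)
    best_y, best_val = None, -np.inf
    for _ in range(200):
        y = rng.uniform(-1.0, 1.0, size=d)
        val = toy_objective(embed(y))
        if val > best_val:
            best_y, best_val = y, val
    print("best value found:", best_val)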

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-nayebi19a,
  title     = {A Framework for {B}ayesian Optimization in Embedded Subspaces},
  author    = {Nayebi, Amin and Munteanu, Alexander and Poloczek, Matthias},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4752--4761},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/nayebi19a/nayebi19a.pdf},
  url       = {https://proceedings.mlr.press/v97/nayebi19a.html}
}
Endnote
%0 Conference Paper
%T A Framework for Bayesian Optimization in Embedded Subspaces
%A Amin Nayebi
%A Alexander Munteanu
%A Matthias Poloczek
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-nayebi19a
%I PMLR
%P 4752--4761
%U https://proceedings.mlr.press/v97/nayebi19a.html
%V 97
APA
Nayebi, A., Munteanu, A. & Poloczek, M. (2019). A Framework for Bayesian Optimization in Embedded Subspaces. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4752-4761. Available from https://proceedings.mlr.press/v97/nayebi19a.html.