Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization

Chengyue Gong, Jian Peng, Qiang Liu
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:2347-2356, 2019.

Abstract

Batch Bayesian optimization has been shown to be an efficient and successful approach for black-box function optimization, especially when evaluation of the cost function is expensive but can be efficiently parallelized. In this paper, we introduce a novel variational framework for batch query optimization, based on the argument that the query batch should be selected to have both high diversity and good worst-case performance. This motivates a variational objective that combines a quantile-based risk measure (for worst-case performance) with entropy regularization (for enforcing diversity). We derive a particle-based gradient descent algorithm for solving this quantile-based variational objective, which generalizes Stein variational gradient descent (SVGD). Extensive experiments on a number of real-world applications show that our method consistently achieves better or comparable performance compared to recent state-of-the-art batch Bayesian optimization methods.
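For readers unfamiliar with SVGD, the following is a minimal NumPy sketch of the kind of particle update the paper generalizes. The svgd_step function is the standard SVGD update; the quantile_weights helper is purely illustrative of how a quantile-style reweighting could enter such an update, and is an assumption rather than the authors' exact rule.

import numpy as np

def rbf_kernel(X, h=None):
    # Pairwise RBF kernel K[i, j] = exp(-||x_i - x_j||^2 / h) and its
    # gradient with respect to the first argument.
    diffs = X[:, None, :] - X[None, :, :]           # (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)          # (n, n)
    if h is None:
        # Median heuristic for the bandwidth.
        h = np.median(sq_dists) / np.log(len(X) + 1) + 1e-8
    K = np.exp(-sq_dists / h)                       # (n, n)
    grad_K = -2.0 / h * diffs * K[..., None]        # grad_K[i, j] = d k(x_i, x_j) / d x_i
    return K, grad_K

def svgd_step(X, grad_log_p, weights=None, step_size=1e-2):
    # One SVGD update. With uniform weights this is standard SVGD:
    #   phi(x_i) = sum_j w_j [ k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i) ],
    # where the first term pulls particles toward high density and the
    # second (repulsive) term enforces diversity among particles.
    n = len(X)
    w = np.full(n, 1.0 / n) if weights is None else weights / weights.sum()
    K, grad_K = rbf_kernel(X)
    scores = grad_log_p(X)                          # (n, d)
    attract = K @ (w[:, None] * scores)             # K is symmetric, so K[i, j] = k(x_j, x_i)
    repulse = np.einsum('j,jid->id', w, grad_K)     # sum_j w_j grad_{x_j} k(x_j, x_i)
    return X + step_size * (attract + repulse)

def quantile_weights(values, alpha=0.3):
    # Hypothetical CVaR-style reweighting: uniform weight on particles whose
    # value lies in the top alpha fraction, zero elsewhere. Illustrative only;
    # see the paper for the actual quantile-based objective.
    thresh = np.quantile(values, 1.0 - alpha)
    w = (values >= thresh).astype(float)
    return w / w.sum()

# Usage: draw approximate samples from a standard 2-D Gaussian.
X = np.random.randn(50, 2)
for _ in range(500):
    X = svgd_step(X, lambda X: -X)                  # score of N(0, I) is -x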

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-gong19b,
  title     = {Quantile Stein Variational Gradient Descent for Batch {B}ayesian Optimization},
  author    = {Gong, Chengyue and Peng, Jian and Liu, Qiang},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {2347--2356},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/gong19b/gong19b.pdf},
  url       = {https://proceedings.mlr.press/v97/gong19b.html}
}
EndNote
%0 Conference Paper
%T Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization
%A Chengyue Gong
%A Jian Peng
%A Qiang Liu
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-gong19b
%I PMLR
%P 2347--2356
%U https://proceedings.mlr.press/v97/gong19b.html
%V 97
APA
Gong, C., Peng, J. & Liu, Q. (2019). Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:2347-2356. Available from https://proceedings.mlr.press/v97/gong19b.html.
