High Dimensional Bayesian Optimization using Lasso Variable Selection

Vu Viet Hoang, Hung The Tran, Sunil Gupta, Vu Nguyen
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3106-3114, 2025.

Abstract

Bayesian optimization (BO) is a leading method for optimizing expensive black-box functions and has been successfully applied across various scenarios. However, BO suffers from the curse of dimensionality, making it challenging to scale to high-dimensional problems. Existing work has adopted a variable selection strategy to select and optimize only a subset of variables iteratively. Although this approach can mitigate the high-dimensional challenge in BO, it still leads to sample inefficiency. To address this issue, we introduce a novel method that identifies important variables by estimating the length scales of Gaussian process kernels. Next, we construct an effective search region consisting of multiple subspaces and optimize the acquisition function within this region, focusing on only the important variables. We demonstrate that our proposed method achieves cumulative regret with a sublinear growth rate in the worst case while maintaining computational efficiency. Experiments on high-dimensional synthetic functions and real-world problems show that our method achieves state-of-the-art performance.
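The abstract's variable-selection step can be illustrated with a minimal sketch: fit a Gaussian process with an ARD (one length scale per dimension) kernel and keep the dimensions whose learned length scales are smallest, since short length scales indicate high sensitivity. This is not the authors' implementation; the problem dimension d, sample budget n, toy objective, and number of selected variables k below are illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
d, n = 20, 50                                   # assumed dimension and sample budget
X = rng.uniform(size=(n, d))
# Toy objective: only the first two variables matter.
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.01 * rng.standard_normal(n)

# ARD kernel: one length scale per input dimension, learned during fitting.
kernel = RBF(length_scale=np.ones(d), length_scale_bounds=(1e-2, 1e2))
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Smaller length scale -> the function changes faster along that axis -> more important.
length_scales = gp.kernel_.length_scale
k = 5                                           # assumed number of variables to keep
important = np.argsort(length_scales)[:k]
print("selected variables:", important)

In a full BO loop, the acquisition function would then be optimized only over the selected variables (within a restricted search region), with the remaining coordinates held fixed or sampled.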

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-hoang25a,
  title     = {High Dimensional Bayesian Optimization using Lasso Variable Selection},
  author    = {Hoang, Vu Viet and Tran, Hung The and Gupta, Sunil and Nguyen, Vu},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3106--3114},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/hoang25a/hoang25a.pdf},
  url       = {https://proceedings.mlr.press/v258/hoang25a.html},
  abstract  = {Bayesian optimization (BO) is a leading method for optimizing expensive black-box functions and has been successfully applied across various scenarios. However, BO suffers from the curse of dimensionality, making it challenging to scale to high-dimensional problems. Existing work has adopted a variable selection strategy to select and optimize only a subset of variables iteratively. Although this approach can mitigate the high-dimensional challenge in BO, it still leads to sample inefficiency. To address this issue, we introduce a novel method that identifies important variables by estimating the length scales of Gaussian process kernels. Next, we construct an effective search region consisting of multiple subspaces and optimize the acquisition function within this region, focusing on only the important variables. We demonstrate that our proposed method achieves cumulative regret with a sublinear growth rate in the worst case while maintaining computational efficiency. Experiments on high-dimensional synthetic functions and real-world problems show that our method achieves state-of-the-art performance.}
}
Endnote
%0 Conference Paper
%T High Dimensional Bayesian Optimization using Lasso Variable Selection
%A Vu Viet Hoang
%A Hung The Tran
%A Sunil Gupta
%A Vu Nguyen
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-hoang25a
%I PMLR
%P 3106--3114
%U https://proceedings.mlr.press/v258/hoang25a.html
%V 258
%X Bayesian optimization (BO) is a leading method for optimizing expensive black-box functions and has been successfully applied across various scenarios. However, BO suffers from the curse of dimensionality, making it challenging to scale to high-dimensional problems. Existing work has adopted a variable selection strategy to select and optimize only a subset of variables iteratively. Although this approach can mitigate the high-dimensional challenge in BO, it still leads to sample inefficiency. To address this issue, we introduce a novel method that identifies important variables by estimating the length scales of Gaussian process kernels. Next, we construct an effective search region consisting of multiple subspaces and optimize the acquisition function within this region, focusing on only the important variables. We demonstrate that our proposed method achieves cumulative regret with a sublinear growth rate in the worst case while maintaining computational efficiency. Experiments on high-dimensional synthetic functions and real-world problems show that our method achieves state-of-the-art performance.
APA
Hoang, V.V., Tran, H.T., Gupta, S. & Nguyen, V. (2025). High Dimensional Bayesian Optimization using Lasso Variable Selection. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3106-3114. Available from https://proceedings.mlr.press/v258/hoang25a.html.