Optimizing Dynamic Structures with Bayesian Generative Search

Minh Hoang, Carleton Kingsford
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:4271-4281, 2020.

Abstract

Kernel selection for kernel-based methods is prohibitively expensive due to the NP-hard nature of discrete optimization. Since gradient-based optimizers are not applicable due to the lack of a differentiable objective function, many state-of-the-art solutions resort to heuristic search or gradient-free optimization. These approaches, however, require imposing restrictive assumptions on the explorable space of structures, such as limiting the active candidate pool, and thus depend heavily on the intuition of domain experts. This paper instead proposes \textbf{DTERGENS}, a novel generative search framework that constructs and optimizes a high-performance composite kernel expression generator. \textbf{DTERGENS} does not restrict the space of candidate kernels and is capable of obtaining flexible-length expressions by jointly optimizing a generative termination criterion. We demonstrate that our framework explores more diverse kernels and obtains better performance than state-of-the-art approaches on many real-world predictive tasks.
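To make the notion of a "composite kernel expression" concrete, the following is a minimal illustrative sketch (not the paper's DTERGENS method): base kernels such as RBF, periodic, and linear can be combined with sums and products, and any valid expression must still produce a symmetric positive semi-definite Gram matrix. The base kernels and hyperparameters below are standard textbook choices, not taken from the paper.

```python
import numpy as np

# Base kernels on scalar inputs (standard textbook forms).
def rbf(x, y, lengthscale=1.0):
    """Squared-exponential (RBF) kernel."""
    return np.exp(-0.5 * (x - y) ** 2 / lengthscale ** 2)

def periodic(x, y, period=1.0, lengthscale=1.0):
    """Standard periodic kernel."""
    return np.exp(-2.0 * np.sin(np.pi * abs(x - y) / period) ** 2 / lengthscale ** 2)

def linear(x, y):
    """Linear (dot-product) kernel."""
    return x * y

# One composite expression, e.g. (RBF * PER) + LIN. Search frameworks
# explore a space of such expressions of varying structure and length.
def composite(x, y):
    return rbf(x, y) * periodic(x, y) + linear(x, y)

# Sanity check: sums and products of kernels are kernels, so the Gram
# matrix of the composite must be symmetric and PSD (up to round-off).
xs = np.linspace(0.0, 2.0, 5)
K = np.array([[composite(a, b) for b in xs] for a in xs])
assert np.allclose(K, K.T)                      # symmetry
assert np.min(np.linalg.eigvalsh(K)) > -1e-9    # positive semi-definite
print("composite Gram matrix is symmetric PSD")
```

The closure property (sum and product of kernels is a kernel) is what makes this discrete space of expressions well-defined, and its unbounded, variable-length nature is what makes the search hard.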

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-hoang20a,
  title     = {Optimizing Dynamic Structures with {B}ayesian Generative Search},
  author    = {Hoang, Minh and Kingsford, Carleton},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4271--4281},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/hoang20a/hoang20a.pdf},
  url       = {https://proceedings.mlr.press/v119/hoang20a.html},
  abstract  = {Kernel selection for kernel-based methods is prohibitively expensive due to the NP-hard nature of discrete optimization. Since gradient-based optimizers are not applicable due to the lack of a differentiable objective function, many state-of-the-art solutions resort to heuristic search or gradient-free optimization. These approaches, however, require imposing restrictive assumptions on the explorable space of structures, such as limiting the active candidate pool, and thus depend heavily on the intuition of domain experts. This paper instead proposes \textbf{DTERGENS}, a novel generative search framework that constructs and optimizes a high-performance composite kernel expression generator. \textbf{DTERGENS} does not restrict the space of candidate kernels and is capable of obtaining flexible-length expressions by jointly optimizing a generative termination criterion. We demonstrate that our framework explores more diverse kernels and obtains better performance than state-of-the-art approaches on many real-world predictive tasks.}
}
Endnote
%0 Conference Paper
%T Optimizing Dynamic Structures with Bayesian Generative Search
%A Minh Hoang
%A Carleton Kingsford
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-hoang20a
%I PMLR
%P 4271--4281
%U https://proceedings.mlr.press/v119/hoang20a.html
%V 119
%X Kernel selection for kernel-based methods is prohibitively expensive due to the NP-hard nature of discrete optimization. Since gradient-based optimizers are not applicable due to the lack of a differentiable objective function, many state-of-the-art solutions resort to heuristic search or gradient-free optimization. These approaches, however, require imposing restrictive assumptions on the explorable space of structures, such as limiting the active candidate pool, and thus depend heavily on the intuition of domain experts. This paper instead proposes \textbf{DTERGENS}, a novel generative search framework that constructs and optimizes a high-performance composite kernel expression generator. \textbf{DTERGENS} does not restrict the space of candidate kernels and is capable of obtaining flexible-length expressions by jointly optimizing a generative termination criterion. We demonstrate that our framework explores more diverse kernels and obtains better performance than state-of-the-art approaches on many real-world predictive tasks.
APA
Hoang, M. & Kingsford, C. (2020). Optimizing Dynamic Structures with Bayesian Generative Search. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:4271-4281. Available from https://proceedings.mlr.press/v119/hoang20a.html.
