Multi-Fidelity Black-Box Optimization with Hierarchical Partitions

Rajat Sen, Kirthevasan Kandasamy, Sanjay Shakkottai
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4538-4547, 2018.

Abstract

Motivated by settings such as hyper-parameter tuning and physical simulations, we consider the problem of black-box optimization of a function. Multi-fidelity techniques have become popular for applications where exact function evaluations are expensive but coarse (biased) approximations are available at much lower cost. A canonical example is hyper-parameter selection in a learning algorithm: the algorithm can be trained for fewer iterations, which lowers the cost, but its validation error is then only coarsely indicative of what it would be had the algorithm been trained to completion. We incorporate the multi-fidelity setup into the powerful framework of black-box optimization through hierarchical partitioning. We develop tree-search-based multi-fidelity algorithms with theoretical guarantees on simple regret. Finally, we demonstrate the performance gains of our algorithms on both real and synthetic datasets.
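To make the setup concrete, the sketch below is a minimal, hypothetical illustration (not the paper's algorithms) of tree search over a hierarchical binary partition of an interval with cheap, biased evaluations: each cell is scored by a biased estimate plus a bias allowance and a diameter term, the most optimistic cell is split, and the query fidelity is increased as cells shrink. The names (multi_fidelity_tree_search, bias, cost), the Lipschitz-style diameter term, and the depth-to-fidelity schedule are all illustrative assumptions, not details taken from the paper.

import math

def multi_fidelity_tree_search(f, lo=0.0, hi=1.0, budget=200.0,
                               bias=lambda z: 0.5 * (1.0 - z),
                               cost=lambda z: 1.0 + 9.0 * z,
                               max_depth=12):
    """Toy tree search on a binary partition of [lo, hi].

    f(x, z) is a biased estimate of the target f(x, 1); the bias at
    fidelity z in [0, 1] is assumed bounded by bias(z), and one query
    costs cost(z). Deeper (smaller) cells are queried at higher
    fidelity, since the bias must shrink with the cell diameter for the
    estimate to remain informative.
    """
    x0, z0 = 0.5 * (lo + hi), 0.25
    v0 = f(x0, z0)
    spent = cost(z0)
    # Each leaf stores (left, right, depth, optimistic value, fidelity).
    leaves = [(lo, hi, 0, v0 + bias(z0), z0)]
    best_x, best_val = x0, v0

    while spent < budget:
        expandable = [i for i, (a, b, d, _, _) in enumerate(leaves)
                      if d < max_depth]
        if not expandable:
            break
        # Expand the leaf with the largest upper bound: biased estimate
        # + bias allowance + cell diameter (a crude Lipschitz-style term).
        idx = max(expandable,
                  key=lambda i: leaves[i][3] + (leaves[i][1] - leaves[i][0]))
        a, b, d, _, _ = leaves.pop(idx)
        mid = 0.5 * (a + b)
        z = min(1.0, 0.25 + d / max_depth)   # fidelity grows with depth
        for left, right in ((a, mid), (mid, b)):
            x = 0.5 * (left + right)
            val = f(x, z)
            spent += cost(z)
            leaves.append((left, right, d + 1, val + bias(z), z))
            if val > best_val:
                best_x, best_val = x, val
            if spent >= budget:
                break
    return best_x, best_val

if __name__ == "__main__":
    def f(x, z):
        # Toy target f(x, 1) = -(x - 0.7)^2; lower fidelities add a bias
        # bounded by 0.3 * (1 - z), consistent with the default bias().
        return -(x - 0.7) ** 2 + (1.0 - z) * 0.3 * math.sin(20.0 * x)

    x_best, v_best = multi_fidelity_tree_search(f)
    print(f"best x ~ {x_best:.3f}, estimated value ~ {v_best:.3f}")

Under these assumed bias and cost models, the search spends cheap low-fidelity queries on large cells and reserves expensive high-fidelity queries for the few small cells near the optimum, which is the qualitative behaviour the multi-fidelity setup is meant to exploit; in an application, the bias bound and cost model would have to be supplied or estimated.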

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-sen18a,
  title     = {Multi-Fidelity Black-Box Optimization with Hierarchical Partitions},
  author    = {Sen, Rajat and Kandasamy, Kirthevasan and Shakkottai, Sanjay},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {4538--4547},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/sen18a/sen18a.pdf},
  url       = {https://proceedings.mlr.press/v80/sen18a.html},
  abstract  = {Motivated by settings such as hyper-parameter tuning and physical simulations, we consider the problem of black-box optimization of a function. Multi-fidelity techniques have become popular for applications where exact function evaluations are expensive, but coarse (biased) approximations are available at much lower cost. A canonical example is that of hyper-parameter selection in a learning algorithm. The learning algorithm can be trained for fewer iterations – this results in a lower cost, but its validation error is only coarsely indicative of the same if the algorithm had been trained till completion. We incorporate the multi-fidelity setup into the powerful framework of black-box optimization through hierarchical partitioning. We develop tree-search based multi-fidelity algorithms with theoretical guarantees on simple regret. We finally demonstrate the performance gains of our algorithms on both real and synthetic datasets.}
}
Endnote
%0 Conference Paper
%T Multi-Fidelity Black-Box Optimization with Hierarchical Partitions
%A Rajat Sen
%A Kirthevasan Kandasamy
%A Sanjay Shakkottai
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-sen18a
%I PMLR
%P 4538--4547
%U https://proceedings.mlr.press/v80/sen18a.html
%V 80
%X Motivated by settings such as hyper-parameter tuning and physical simulations, we consider the problem of black-box optimization of a function. Multi-fidelity techniques have become popular for applications where exact function evaluations are expensive, but coarse (biased) approximations are available at much lower cost. A canonical example is that of hyper-parameter selection in a learning algorithm. The learning algorithm can be trained for fewer iterations – this results in a lower cost, but its validation error is only coarsely indicative of the same if the algorithm had been trained till completion. We incorporate the multi-fidelity setup into the powerful framework of black-box optimization through hierarchical partitioning. We develop tree-search based multi-fidelity algorithms with theoretical guarantees on simple regret. We finally demonstrate the performance gains of our algorithms on both real and synthetic datasets.
APA
Sen, R., Kandasamy, K. & Shakkottai, S. (2018). Multi-Fidelity Black-Box Optimization with Hierarchical Partitions. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:4538-4547. Available from https://proceedings.mlr.press/v80/sen18a.html.
