Noisy Blackbox Optimization using Multi-fidelity Queries: A Tree Search Approach

Rajat Sen, Kirthevasan Kandasamy, Sanjay Shakkottai
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2096-2105, 2019.

Abstract

We study the problem of black-box optimization of a noisy function in the presence of low-cost approximations or fidelities, which is motivated by problems like hyper-parameter tuning. In hyper-parameter tuning, evaluating the black-box function at a point involves training a learning algorithm on a large data-set at a particular hyper-parameter and evaluating the validation error. Even a single such evaluation can be prohibitively expensive. Therefore, it is beneficial to use low-cost approximations, such as training the learning algorithm on a sub-sampled version of the whole data-set. These low-cost approximations/fidelities can, however, provide a biased and noisy estimate of the function value. In this work, we combine structured state-space exploration through hierarchical partitioning with querying these partitions at multiple fidelities, and develop a multi-fidelity, bandit-based tree-search algorithm for noisy black-box optimization. We derive simple regret guarantees for our algorithm and validate its performance on real and synthetic datasets.
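
To make the multi-fidelity setting concrete, the sketch below (not the authors' code; the dataset, model, and all names are illustrative assumptions) treats the fraction of the training set used as the fidelity of a hyper-parameter query: low fidelities are cheap but return a biased, noisy estimate of the full-data validation error.

# Minimal sketch of a multi-fidelity query for hyper-parameter tuning.
# Assumptions: synthetic data, logistic regression, and the "fidelity" is
# the fraction of training data used; none of this is from the paper itself.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=0)
rng = np.random.default_rng(0)

def query(hyperparam_C, fidelity):
    """Validation error of a model trained on a `fidelity` fraction of the
    training set, at regularisation strength `hyperparam_C` (hypothetical names)."""
    n = max(100, int(fidelity * len(X_train)))
    idx = rng.choice(len(X_train), size=n, replace=False)
    model = LogisticRegression(C=hyperparam_C, max_iter=1000)
    model.fit(X_train[idx], y_train[idx])
    return 1.0 - model.score(X_val, y_val)

# Low fidelity: cheap but biased and noisy; full fidelity: expensive but accurate.
print(query(hyperparam_C=1.0, fidelity=0.05))
print(query(hyperparam_C=1.0, fidelity=1.0))

A tree-search method in this setting would spend most of its budget on such cheap, low-fidelity queries while exploring the hyper-parameter space, reserving full-fidelity evaluations for the most promising partitions.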

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-sen19a,
  title     = {Noisy Blackbox Optimization using Multi-fidelity Queries: A Tree Search Approach},
  author    = {Sen, Rajat and Kandasamy, Kirthevasan and Shakkottai, Sanjay},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {2096--2105},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/sen19a/sen19a.pdf},
  url       = {https://proceedings.mlr.press/v89/sen19a.html},
  abstract  = {We study the problem of black-box optimization of a noisy function in the presence of low-cost approximations or fidelities, which is motivated by problems like hyper-parameter tuning. In hyper-parameter tuning, evaluating the black-box function at a point involves training a learning algorithm on a large data-set at a particular hyper-parameter and evaluating the validation error. Even a single such evaluation can be prohibitively expensive. Therefore, it is beneficial to use low-cost approximations, such as training the learning algorithm on a sub-sampled version of the whole data-set. These low-cost approximations/fidelities can, however, provide a biased and noisy estimate of the function value. In this work, we combine structured state-space exploration through hierarchical partitioning with querying these partitions at multiple fidelities, and develop a multi-fidelity, bandit-based tree-search algorithm for noisy black-box optimization. We derive simple regret guarantees for our algorithm and validate its performance on real and synthetic datasets.}
}
Endnote
%0 Conference Paper
%T Noisy Blackbox Optimization using Multi-fidelity Queries: A Tree Search Approach
%A Rajat Sen
%A Kirthevasan Kandasamy
%A Sanjay Shakkottai
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-sen19a
%I PMLR
%P 2096--2105
%U https://proceedings.mlr.press/v89/sen19a.html
%V 89
%X We study the problem of black-box optimization of a noisy function in the presence of low-cost approximations or fidelities, which is motivated by problems like hyper-parameter tuning. In hyper-parameter tuning, evaluating the black-box function at a point involves training a learning algorithm on a large data-set at a particular hyper-parameter and evaluating the validation error. Even a single such evaluation can be prohibitively expensive. Therefore, it is beneficial to use low-cost approximations, such as training the learning algorithm on a sub-sampled version of the whole data-set. These low-cost approximations/fidelities can, however, provide a biased and noisy estimate of the function value. In this work, we combine structured state-space exploration through hierarchical partitioning with querying these partitions at multiple fidelities, and develop a multi-fidelity, bandit-based tree-search algorithm for noisy black-box optimization. We derive simple regret guarantees for our algorithm and validate its performance on real and synthetic datasets.
APA
Sen, R., Kandasamy, K. & Shakkottai, S. (2019). Noisy Blackbox Optimization using Multi-fidelity Queries: A Tree Search Approach. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2096-2105. Available from https://proceedings.mlr.press/v89/sen19a.html.
