Noisy Blackbox Optimization using Multi-fidelity Queries: A Tree Search Approach
Proceedings of Machine Learning Research, PMLR 89:2096-2105, 2019.
Abstract
We study the problem of blackbox optimization of a noisy function in the presence of low-cost approximations, or fidelities, which is motivated by problems like hyperparameter tuning. In hyperparameter tuning, evaluating the blackbox function at a point involves training a learning algorithm on a large dataset with a particular hyperparameter setting and measuring the validation error. Even a single such evaluation can be prohibitively expensive. It is therefore beneficial to use low-cost approximations, such as training the learning algorithm on a subsampled version of the whole dataset. These low-cost approximations/fidelities can, however, provide a biased and noisy estimate of the function value. In this work, we combine structured state-space exploration through hierarchical partitioning with querying these partitions at multiple fidelities, and develop a multi-fidelity, bandit-based tree-search algorithm for noisy blackbox optimization. We derive simple regret guarantees for our algorithm and validate its performance on real and synthetic datasets.
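To illustrate the flavor of the approach described above, here is a minimal Python sketch: a binary tree search over [0, 1] that screens each pair of child cells with cheap low-fidelity queries and pays for high-fidelity queries only when the cheap estimates cannot separate the cells. This is an illustrative toy under invented assumptions (the objective `noisy_f`, the fidelity/bias/noise model, and all parameters are hypothetical), not the paper's actual algorithm or its regret-optimal query rule.

```python
import random

def noisy_f(x, fidelity):
    """Toy objective to maximize: f(x) = -(x - 0.7)^2 (hypothetical stand-in
    for validation accuracy). Lower fidelity (a cheaper query, like training
    on a subsampled dataset) returns a more biased, noisier estimate."""
    bias = 0.05 * (1.0 - fidelity)                # cheap queries are biased
    noise = random.gauss(0.0, 0.01 + 0.04 * (1.0 - fidelity))
    return -(x - 0.7) ** 2 + bias + noise

def tree_search(depth=6, low_reps=5, high_reps=8, gap=0.05, seed=0):
    """Tree search on [0, 1]: split the current cell in two, screen both
    halves with cheap low-fidelity queries, and escalate to high-fidelity
    queries only when the cheap estimates are inconclusive."""
    random.seed(seed)
    lo, hi = 0.0, 1.0
    for _ in range(depth):
        mid = (lo + hi) / 2.0
        cells = [(lo, mid), (mid, hi)]
        centers = [(a + b) / 2.0 for a, b in cells]
        # Low-fidelity screening: average a few cheap, noisy evaluations.
        scores = [sum(noisy_f(c, fidelity=0.2) for _ in range(low_reps)) / low_reps
                  for c in centers]
        # If the cheap queries cannot separate the cells, pay for fidelity.
        if abs(scores[0] - scores[1]) < gap:
            scores = [sum(noisy_f(c, fidelity=1.0) for _ in range(high_reps)) / high_reps
                      for c in centers]
        lo, hi = cells[scores.index(max(scores))]  # descend into the winner
    return (lo + hi) / 2.0

x_best = tree_search()
```

Note that in this toy the low-fidelity bias is shared by both cells at a given level and so cancels in the comparison; handling fidelity bias more carefully is part of what the paper's analysis addresses.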