Multi-Fidelity Black-Box Optimization with Hierarchical Partitions
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4538-4547, 2018.
Abstract
Motivated by settings such as hyperparameter tuning and physical simulations, we consider the problem of black-box optimization of a function. Multi-fidelity techniques have become popular for applications where exact function evaluations are expensive, but coarse (biased) approximations are available at much lower cost. A canonical example is that of hyperparameter selection for a learning algorithm. The learning algorithm can be trained for fewer iterations, which lowers the cost, but the resulting validation error is only coarsely indicative of what it would have been had the algorithm been trained to completion. We incorporate the multi-fidelity setup into the powerful framework of black-box optimization through hierarchical partitioning. We develop tree-search based multi-fidelity algorithms with theoretical guarantees on simple regret. Finally, we demonstrate the performance gains of our algorithms on both real and synthetic datasets.
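The cost/bias trade-off described in the abstract can be illustrated with a minimal sketch. This is a hypothetical toy example, not the paper's algorithm: a scalar black-box objective is observed through a proxy whose bias shrinks (and whose cost grows) as the fidelity parameter increases, mirroring a learner trained for more iterations.

```python
def multifidelity_eval(x, fidelity, max_fidelity=100):
    """Toy black-box objective f(x) = -(x - 0.3)**2, observed through a
    biased low-fidelity proxy (hypothetical illustration, not the paper's
    method). Returns (observed value, evaluation cost)."""
    true_value = -(x - 0.3) ** 2             # exact (highest-fidelity) value
    bias = 1.0 - fidelity / max_fidelity     # coarser fidelity => larger bias
    cost = fidelity                          # cost grows with fidelity
    return true_value - 0.5 * bias, cost

# Low fidelity is cheap but biased; full fidelity is expensive but exact.
lo_val, lo_cost = multifidelity_eval(0.3, fidelity=10)
hi_val, hi_cost = multifidelity_eval(0.3, fidelity=100)
```

A multi-fidelity optimizer spends most of its budget on cheap low-fidelity queries to narrow down promising regions, reserving expensive high-fidelity evaluations for the final candidates.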