Optimizing Hyperparameters with Conformal Quantile Regression

David Salinas, Jacek Golebiowski, Aaron Klein, Matthias Seeger, Cedric Archambeau
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:29876-29893, 2023.

Abstract

Many state-of-the-art hyperparameter optimization (HPO) algorithms rely on model-based optimizers that learn surrogate models of the target function to guide the search. Gaussian processes are the de facto surrogate model due to their ability to capture uncertainty. However, they make strong assumptions about the observation noise, which might not be warranted in practice. In this work, we propose to leverage conformalized quantile regression which makes minimal assumptions about the observation noise and, as a result, models the target function in a more realistic and robust fashion which translates to quicker HPO convergence on empirical benchmarks. To apply our method in a multi-fidelity setting, we propose a simple, yet effective, technique that aggregates observed results across different resource levels and outperforms conventional methods across many empirical tasks.
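
To make the core idea concrete, below is a minimal, illustrative sketch of conformalized quantile regression in the style of split conformal prediction: fit lower and upper quantile models, compute conformity scores on a held-out calibration set, and widen the predicted interval by the calibrated score quantile. The gradient-boosted quantile regressors, the toy objective, and all variable names are illustrative assumptions, not the surrogate model or benchmarks used in the paper.

# Sketch of conformalized quantile regression (split conformal calibration of
# quantile predictions). The quantile model and toy objective are assumptions
# made for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy "HPO" data: x = a single hyperparameter, y = noisy validation error
# with input-dependent (heteroscedastic) noise.
X = rng.uniform(0.0, 1.0, size=(300, 1))
y = (X[:, 0] - 0.3) ** 2 + 0.1 * (1.0 + X[:, 0]) * rng.standard_normal(300)

# Split into a training set for the quantile models and a calibration set.
X_train, y_train = X[:200], y[:200]
X_cal, y_cal = X[200:], y[200:]

alpha = 0.1  # target miscoverage, i.e. 90% prediction intervals
lo = GradientBoostingRegressor(loss="quantile", alpha=alpha / 2).fit(X_train, y_train)
hi = GradientBoostingRegressor(loss="quantile", alpha=1 - alpha / 2).fit(X_train, y_train)

# Conformity scores: how far calibration points fall outside the raw interval.
scores = np.maximum(lo.predict(X_cal) - y_cal, y_cal - hi.predict(X_cal))
n = len(y_cal)
# Finite-sample-corrected empirical quantile of the scores.
q_hat = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Conformalized intervals for new configurations: marginal coverage holds under
# exchangeability, with no parametric assumption on the observation noise.
X_new = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
lower = lo.predict(X_new) - q_hat
upper = hi.predict(X_new) + q_hat
print(np.c_[X_new, lower, upper])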

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-salinas23a,
  title     = {Optimizing Hyperparameters with Conformal Quantile Regression},
  author    = {Salinas, David and Golebiowski, Jacek and Klein, Aaron and Seeger, Matthias and Archambeau, Cedric},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {29876--29893},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/salinas23a/salinas23a.pdf},
  url       = {https://proceedings.mlr.press/v202/salinas23a.html},
  abstract  = {Many state-of-the-art hyperparameter optimization (HPO) algorithms rely on model-based optimizers that learn surrogate models of the target function to guide the search. Gaussian processes are the de facto surrogate model due to their ability to capture uncertainty. However, they make strong assumptions about the observation noise, which might not be warranted in practice. In this work, we propose to leverage conformalized quantile regression which makes minimal assumptions about the observation noise and, as a result, models the target function in a more realistic and robust fashion which translates to quicker HPO convergence on empirical benchmarks. To apply our method in a multi-fidelity setting, we propose a simple, yet effective, technique that aggregates observed results across different resource levels and outperforms conventional methods across many empirical tasks.}
}
Endnote
%0 Conference Paper
%T Optimizing Hyperparameters with Conformal Quantile Regression
%A David Salinas
%A Jacek Golebiowski
%A Aaron Klein
%A Matthias Seeger
%A Cedric Archambeau
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-salinas23a
%I PMLR
%P 29876--29893
%U https://proceedings.mlr.press/v202/salinas23a.html
%V 202
%X Many state-of-the-art hyperparameter optimization (HPO) algorithms rely on model-based optimizers that learn surrogate models of the target function to guide the search. Gaussian processes are the de facto surrogate model due to their ability to capture uncertainty. However, they make strong assumptions about the observation noise, which might not be warranted in practice. In this work, we propose to leverage conformalized quantile regression which makes minimal assumptions about the observation noise and, as a result, models the target function in a more realistic and robust fashion which translates to quicker HPO convergence on empirical benchmarks. To apply our method in a multi-fidelity setting, we propose a simple, yet effective, technique that aggregates observed results across different resource levels and outperforms conventional methods across many empirical tasks.
APA
Salinas, D., Golebiowski, J., Klein, A., Seeger, M. & Archambeau, C. (2023). Optimizing Hyperparameters with Conformal Quantile Regression. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:29876-29893. Available from https://proceedings.mlr.press/v202/salinas23a.html.
