Bayesian Optimization is Superior to Random Search for Machine Learning Hyperparameter Tuning: Analysis of the Black-Box Optimization Challenge 2020

Ryan Turner, David Eriksson, Michael McCourt, Juha Kiili, Eero Laaksonen, Zhen Xu, Isabelle Guyon
Proceedings of the NeurIPS 2020 Competition and Demonstration Track, PMLR 133:3-26, 2021.

Abstract

This paper presents the results and insights from the black-box optimization (BBO) challenge at NeurIPS2020 which ran from July–October, 2020. The challenge emphasized the importance of evaluating derivative-free optimizers for tuning the hyperparameters of machine learning models. This was the first black-box optimization challenge with a machine learning emphasis. It was based on tuning (validation set) performance of standard machine learning models on real datasets. This competition has widespread impact as black-box optimization (e.g., Bayesian optimization) is relevant for hyperparameter tuning in almost every machine learning project as well as many applications outside of machine learning. The final leaderboard was determined using the optimization performance on held-out (hidden) objective functions, where the optimizers ran without human intervention. Baselines were set using the default settings of several open source black-box optimization packages as well as random search.

Cite this Paper


BibTeX
@InProceedings{pmlr-v133-turner21a,
  title     = {Bayesian Optimization is Superior to Random Search for Machine Learning Hyperparameter Tuning: Analysis of the Black-Box Optimization Challenge 2020},
  author    = {Turner, Ryan and Eriksson, David and McCourt, Michael and Kiili, Juha and Laaksonen, Eero and Xu, Zhen and Guyon, Isabelle},
  booktitle = {Proceedings of the NeurIPS 2020 Competition and Demonstration Track},
  pages     = {3--26},
  year      = {2021},
  editor    = {Escalante, Hugo Jair and Hofmann, Katja},
  volume    = {133},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--12 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v133/turner21a/turner21a.pdf},
  url       = {https://proceedings.mlr.press/v133/turner21a.html},
  abstract  = {This paper presents the results and insights from the black-box optimization (BBO) challenge at NeurIPS2020 which ran from July–October, 2020. The challenge emphasized the importance of evaluating derivative-free optimizers for tuning the hyperparameters of machine learning models. This was the first black-box optimization challenge with a machine learning emphasis. It was based on tuning (validation set) performance of standard machine learning models on real datasets. This competition has widespread impact as black-box optimization (e.g., Bayesian optimization) is relevant for hyperparameter tuning in almost every machine learning project as well as many applications outside of machine learning. The final leaderboard was determined using the optimization performance on held-out (hidden) objective functions, where the optimizers ran without human intervention. Baselines were set using the default settings of several open source black-box optimization packages as well as random search.}
}
Endnote
%0 Conference Paper
%T Bayesian Optimization is Superior to Random Search for Machine Learning Hyperparameter Tuning: Analysis of the Black-Box Optimization Challenge 2020
%A Ryan Turner
%A David Eriksson
%A Michael McCourt
%A Juha Kiili
%A Eero Laaksonen
%A Zhen Xu
%A Isabelle Guyon
%B Proceedings of the NeurIPS 2020 Competition and Demonstration Track
%C Proceedings of Machine Learning Research
%D 2021
%E Hugo Jair Escalante
%E Katja Hofmann
%F pmlr-v133-turner21a
%I PMLR
%P 3--26
%U https://proceedings.mlr.press/v133/turner21a.html
%V 133
%X This paper presents the results and insights from the black-box optimization (BBO) challenge at NeurIPS2020 which ran from July–October, 2020. The challenge emphasized the importance of evaluating derivative-free optimizers for tuning the hyperparameters of machine learning models. This was the first black-box optimization challenge with a machine learning emphasis. It was based on tuning (validation set) performance of standard machine learning models on real datasets. This competition has widespread impact as black-box optimization (e.g., Bayesian optimization) is relevant for hyperparameter tuning in almost every machine learning project as well as many applications outside of machine learning. The final leaderboard was determined using the optimization performance on held-out (hidden) objective functions, where the optimizers ran without human intervention. Baselines were set using the default settings of several open source black-box optimization packages as well as random search.
APA
Turner, R., Eriksson, D., McCourt, M., Kiili, J., Laaksonen, E., Xu, Z. & Guyon, I. (2021). Bayesian Optimization is Superior to Random Search for Machine Learning Hyperparameter Tuning: Analysis of the Black-Box Optimization Challenge 2020. Proceedings of the NeurIPS 2020 Competition and Demonstration Track, in Proceedings of Machine Learning Research 133:3-26. Available from https://proceedings.mlr.press/v133/turner21a.html.