Bayesian optimization for automated model selection

Gustavo Malkomes, Chip Schaff, Roman Garnett
Proceedings of the Workshop on Automatic Machine Learning, PMLR 64:41-47, 2016.

Abstract

Despite the success of kernel-based nonparametric methods, kernel selection still requires considerable expertise, and is often described as a “black art.” We present a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices. Previous efforts in this direction have focused on traversing a kernel grammar, only examining the data via computation of marginal likelihood. Our proposed search method is based on Bayesian optimization in model space, where we reason about model evidence as a function to be maximized. We explicitly reason about the data distribution and how it induces similarity between potential model choices in terms of the explanations they can offer for observed data. In this light, we construct a novel kernel between models to explain a given dataset. Our method is capable of finding a model that explains a given dataset well without any human assistance, often with fewer computations of model evidence than previous approaches, a claim we demonstrate empirically.
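To make the objective concrete, the sketch below (not the authors' implementation) scores a small, fixed set of candidate Gaussian process kernels by their log marginal likelihood, the quantity that serves as "model evidence" in the paper. The paper instead searches an infinite grammar of composite kernels and uses Bayesian optimization with a kernel between models to decide which evidence computation to perform next; the candidate set, synthetic data, and use of scikit-learn here are illustrative assumptions only.

```python
# Hypothetical illustration: rank candidate GP kernels by (approximate) model
# evidence. The paper's contribution is choosing which of infinitely many such
# candidates to evaluate next via Bayesian optimization in model space.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
    RBF, RationalQuadratic, ExpSineSquared, WhiteKernel)

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

# A few compositions from a kernel grammar; the real search space is open-ended.
candidates = {
    "RBF": RBF() + WhiteKernel(),
    "RationalQuadratic": RationalQuadratic() + WhiteKernel(),
    "Periodic": ExpSineSquared() + WhiteKernel(),
    "RBF + Periodic": RBF() + ExpSineSquared() + WhiteKernel(),
}

scores = {}
for name, kernel in candidates.items():
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    # Log marginal likelihood at optimized hyperparameters: a crude stand-in
    # for the Bayesian model evidence that the search seeks to maximize.
    scores[name] = gp.log_marginal_likelihood(gp.kernel_.theta)

best = max(scores, key=scores.get)
print(sorted(scores.items(), key=lambda kv: -kv[1]))
print("selected kernel:", best)
```

Each evidence evaluation requires fitting a GP, which is exactly why the paper treats evidence as an expensive black-box function and spends those evaluations judiciously rather than exhaustively.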

Cite this Paper


BibTeX
@InProceedings{pmlr-v64-malkomes_bayesian_2016, title = {Bayesian optimization for automated model selection}, author = {Malkomes, Gustavo and Schaff, Chip and Garnett, Roman}, booktitle = {Proceedings of the Workshop on Automatic Machine Learning}, pages = {41--47}, year = {2016}, editor = {Hutter, Frank and Kotthoff, Lars and Vanschoren, Joaquin}, volume = {64}, series = {Proceedings of Machine Learning Research}, address = {New York, New York, USA}, month = {24 Jun}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v64/malkomes_bayesian_2016.pdf}, url = {https://proceedings.mlr.press/v64/malkomes_bayesian_2016.html}, abstract = {Despite the success of kernel-based nonparametric methods, kernel selection still requires considerable expertise, and is often described as a “black art.” We present a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices. Previous efforts in this direction have focused on traversing a kernel grammar, only examining the data via computation of marginal likelihood. Our proposed search method is based on Bayesian optimization in model space, where we reason about model evidence as a function to be maximized. We explicitly reason about the data distribution and how it induces similarity between potential model choices in terms of the explanations they can offer for observed data. In this light, we construct a novel kernel between models to explain a given dataset. Our method is capable of finding a model that explains a given dataset well without any human assistance, often with fewer computations of model evidence than previous approaches, a claim we demonstrate empirically.} }
Endnote
%0 Conference Paper %T Bayesian optimization for automated model selection %A Gustavo Malkomes %A Chip Schaff %A Roman Garnett %B Proceedings of the Workshop on Automatic Machine Learning %C Proceedings of Machine Learning Research %D 2016 %E Frank Hutter %E Lars Kotthoff %E Joaquin Vanschoren %F pmlr-v64-malkomes_bayesian_2016 %I PMLR %P 41--47 %U https://proceedings.mlr.press/v64/malkomes_bayesian_2016.html %V 64 %X Despite the success of kernel-based nonparametric methods, kernel selection still requires considerable expertise, and is often described as a “black art.” We present a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices. Previous efforts in this direction have focused on traversing a kernel grammar, only examining the data via computation of marginal likelihood. Our proposed search method is based on Bayesian optimization in model space, where we reason about model evidence as a function to be maximized. We explicitly reason about the data distribution and how it induces similarity between potential model choices in terms of the explanations they can offer for observed data. In this light, we construct a novel kernel between models to explain a given dataset. Our method is capable of finding a model that explains a given dataset well without any human assistance, often with fewer computations of model evidence than previous approaches, a claim we demonstrate empirically.
RIS
TY - CPAPER TI - Bayesian optimization for automated model selection AU - Gustavo Malkomes AU - Chip Schaff AU - Roman Garnett BT - Proceedings of the Workshop on Automatic Machine Learning DA - 2016/12/04 ED - Frank Hutter ED - Lars Kotthoff ED - Joaquin Vanschoren ID - pmlr-v64-malkomes_bayesian_2016 PB - PMLR DP - Proceedings of Machine Learning Research VL - 64 SP - 41 EP - 47 L1 - http://proceedings.mlr.press/v64/malkomes_bayesian_2016.pdf UR - https://proceedings.mlr.press/v64/malkomes_bayesian_2016.html AB - Despite the success of kernel-based nonparametric methods, kernel selection still requires considerable expertise, and is often described as a “black art.” We present a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices. Previous efforts in this direction have focused on traversing a kernel grammar, only examining the data via computation of marginal likelihood. Our proposed search method is based on Bayesian optimization in model space, where we reason about model evidence as a function to be maximized. We explicitly reason about the data distribution and how it induces similarity between potential model choices in terms of the explanations they can offer for observed data. In this light, we construct a novel kernel between models to explain a given dataset. Our method is capable of finding a model that explains a given dataset well without any human assistance, often with fewer computations of model evidence than previous approaches, a claim we demonstrate empirically. ER -
APA
Malkomes, G., Schaff, C. & Garnett, R. (2016). Bayesian optimization for automated model selection. Proceedings of the Workshop on Automatic Machine Learning, in Proceedings of Machine Learning Research 64:41-47. Available from https://proceedings.mlr.press/v64/malkomes_bayesian_2016.html.