Dissecting the Winning Solution of the HiggsML Challenge

Gábor Melis
Proceedings of the NIPS 2014 Workshop on High-energy Physics and Machine Learning, PMLR 42:57-67, 2015.

Abstract

The recent Higgs Machine Learning Challenge pitted one of the largest crowds seen in machine learning contests against one another. In this paper, we present the winning solution and investigate the effect of extra features, the choice of neural network activation function, regularization and data set size. We demonstrate improved classification accuracy using a very similar network architecture on the permutation invariant MNIST benchmark. Furthermore, we advocate the use of a simple method that lies on the boundary between bagging and cross-validation to both estimate the generalization error and improve accuracy.
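The "simple method that lies on the boundary between bagging and cross-validation" can be pictured as follows: one model is trained per cross-validation fold, each model is scored on its held-out fold (giving the generalization-error estimate of cross-validation), and the same per-fold models are averaged on new data (giving the accuracy gain of bagging). The sketch below illustrates that general idea only; the model choice (a scikit-learn logistic regression standing in for the paper's neural network), the AUC metric, and all names are assumptions, not the paper's actual setup.

# Minimal sketch of the bagging/cross-validation hybrid described in the
# abstract. Assumes numpy arrays X, y (training data) and X_test; the
# logistic-regression model is a placeholder, not the paper's network.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def cv_bag(X, y, X_test, n_splits=5, seed=0):
    fold_scores, test_preds = [], []
    for train_idx, valid_idx in KFold(n_splits, shuffle=True, random_state=seed).split(X):
        model = LogisticRegression(max_iter=1000)
        model.fit(X[train_idx], y[train_idx])
        # The held-out fold estimates generalization error, as in cross-validation.
        fold_scores.append(roc_auc_score(y[valid_idx],
                                         model.predict_proba(X[valid_idx])[:, 1]))
        # The same per-fold models are kept and averaged on new data, as in bagging.
        test_preds.append(model.predict_proba(X_test)[:, 1])
    return np.mean(fold_scores), np.mean(test_preds, axis=0)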

Cite this Paper


BibTeX
@InProceedings{pmlr-v42-meli14,
  title     = {Dissecting the Winning Solution of the HiggsML Challenge},
  author    = {Gábor Melis},
  booktitle = {Proceedings of the NIPS 2014 Workshop on High-energy Physics and Machine Learning},
  pages     = {57--67},
  year      = {2015},
  editor    = {Glen Cowan and Cécile Germain and Isabelle Guyon and Balázs Kégl and David Rousseau},
  volume    = {42},
  series    = {Proceedings of Machine Learning Research},
  address   = {Montreal, Canada},
  month     = {13 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v42/meli14.pdf},
  url       = {http://proceedings.mlr.press/v42/meli14.html},
  abstract  = {The recent Higgs Machine Learning Challenge pitted one of the largest crowds seen in machine learning contests against one another. In this paper, we present the winning solution and investigate the effect of extra features, the choice of neural network activation function, regularization and data set size. We demonstrate improved classification accuracy using a very similar network architecture on the permutation invariant MNIST benchmark. Furthermore, we advocate the use of a simple method that lies on the boundary between bagging and cross-validation to both estimate the generalization error and improve accuracy.}
}
APA
Melis, G. (2015). Dissecting the Winning Solution of the HiggsML Challenge. Proceedings of the NIPS 2014 Workshop on High-energy Physics and Machine Learning, in PMLR 42:57-67.
