A Simple Multi-Class Boosting Framework with Theoretical Guarantees and Empirical Proficiency

Ron Appel, Pietro Perona
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:186-194, 2017.

Abstract

There is a need for simple yet accurate white-box learning systems that train quickly and with little data. To this end, we showcase REBEL, a multi-class boosting method, and present a novel family of weak learners called localized similarities. Our framework provably minimizes the training error of any dataset at an exponential rate. We carry out experiments on a variety of synthetic and real datasets, demonstrating a consistent tendency to avoid overfitting. We evaluate our method on MNIST and standard UCI datasets against other state-of-the-art methods, showing the empirical proficiency of our method.
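Since the abstract is the only technical description on this page, the following is a minimal sketch of what a multi-class boosting loop with localized-similarity-style weak learners could look like. It is not the authors' REBEL implementation: the one-vs-all exponential loss, the anchor-pair weak learner, the fixed step size, and all parameter values below are illustrative assumptions; consult the PDF linked in the citation below for the actual algorithm and its guarantees.

    # Sketch: multi-class boosting with "localized similarity"-style weak
    # learners. NOT the authors' REBEL method; loss, learner family, and
    # step size are assumptions made for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)

    def localized_similarity(X, p, q, radius):
        """Weak learner: +1 if x is closer to anchor p than to anchor q,
        -1 otherwise. 'Localized' here means it abstains (outputs 0)
        outside a ball around the anchors' midpoint."""
        h = np.where(np.linalg.norm(X - p, axis=1)
                     < np.linalg.norm(X - q, axis=1), 1.0, -1.0)
        inside = np.linalg.norm(X - (p + q) / 2.0, axis=1) <= radius
        return h * inside

    def fit_boosted(X, y, n_classes, rounds=50, radius=2.0, n_candidates=40):
        n = len(X)
        Y = -np.ones((n, n_classes))
        Y[np.arange(n), y] = 1.0            # one-vs-all label codes
        F = np.zeros((n, n_classes))        # accumulated class scores
        ensemble = []
        for _ in range(rounds):
            W = np.exp(-Y * F)              # exponential-loss weights
            W /= W.sum()
            best = None
            for _ in range(n_candidates):   # try random anchor pairs
                i, j = rng.choice(n, size=2, replace=False)
                h = localized_similarity(X, X[i], X[j], radius)
                # per-class edge: weighted correlation of h with each code
                edges = (W * Y * h[:, None]).sum(axis=0)
                score = np.abs(edges).sum()
                if best is None or score > best[0]:
                    best = (score, X[i], X[j], h, edges)
            _, p, q, h, edges = best
            # per-class vote: fixed-size step in the sign of each edge
            # (the paper derives its own, principled update)
            a = 0.5 * np.sign(edges)
            F += h[:, None] * a
            ensemble.append((p, q, a))
        return ensemble

    def predict(ensemble, X, n_classes, radius=2.0):
        F = np.zeros((len(X), n_classes))
        for p, q, a in ensemble:
            F += localized_similarity(X, p, q, radius)[:, None] * a
        return F.argmax(axis=1)

    # Toy usage: three Gaussian blobs.
    X = np.concatenate([rng.normal(m, 0.5, size=(60, 2))
                        for m in ([0, 0], [3, 0], [0, 3])])
    y = np.repeat([0, 1, 2], 60)
    model = fit_boosted(X, y, n_classes=3)
    print("train accuracy:", (predict(model, X, 3) == y).mean())

The "localized" aspect is modeled here as abstention outside a ball around the two anchors; the exponential-rate training-error guarantee claimed in the abstract applies to the paper's actual framework, not to this sketch.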

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-appel17a,
  title     = {A Simple Multi-Class Boosting Framework with Theoretical Guarantees and Empirical Proficiency},
  author    = {Ron Appel and Pietro Perona},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {186--194},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/appel17a/appel17a.pdf},
  url       = {https://proceedings.mlr.press/v70/appel17a.html}
}
APA
Appel, R. & Perona, P. (2017). A Simple Multi-Class Boosting Framework with Theoretical Guarantees and Empirical Proficiency. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:186-194. Available from https://proceedings.mlr.press/v70/appel17a.html.
