The Impossibility of Parallelizing Boosting

Amin Karbasi, Kasper Green Larsen
Proceedings of The 35th International Conference on Algorithmic Learning Theory, PMLR 237:635-653, 2024.

Abstract

The aim of boosting is to convert a sequence of weak learners into a strong learner. At their heart, these methods are fully sequential. In this paper, we investigate the possibility of parallelizing boosting. Our main contribution is a strong negative result, implying that significant parallelization of boosting requires an exponential blow-up in the total computing resources needed for training.
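To make the sequential dependence concrete, here is a minimal AdaBoost-style sketch. It is illustrative only and not taken from the paper: the helper names and the use of scikit-learn decision stumps as weak learners are assumptions for the example. The point it shows is that the sample weights used to train the weak learner in round t are a function of the mistakes made in round t-1, which is why the rounds cannot simply be run in parallel.

```python
# Illustrative AdaBoost-style sketch (not from the paper); decision stumps as weak learners.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, rounds=10):
    """Labels y are assumed to take values in {-1, +1}."""
    y = np.asarray(y)
    n = len(y)
    weights = np.full(n, 1.0 / n)          # start with uniform sample weights
    learners, alphas = [], []
    for _ in range(rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=weights)      # weak learner trained on CURRENT weights
        pred = stump.predict(X)
        err = np.clip(weights[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)       # this learner's vote in the ensemble
        weights *= np.exp(-alpha * y * pred)        # up-weight points this round got wrong
        weights /= weights.sum()                    # next round depends on this update
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X):
    score = sum(a * h.predict(X) for h, a in zip(learners, alphas))
    return np.sign(score)
```

Each pass through the loop reads the weights produced by the previous pass, so the weak-learner calls form a chain rather than independent tasks; the paper's negative result shows this chain cannot be substantially shortened without an exponential increase in total training resources.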

Cite this Paper


BibTeX
@InProceedings{pmlr-v237-karbasi24a,
  title     = {The Impossibility of Parallelizing Boosting},
  author    = {Karbasi, Amin and Green Larsen, Kasper},
  booktitle = {Proceedings of The 35th International Conference on Algorithmic Learning Theory},
  pages     = {635--653},
  year      = {2024},
  editor    = {Vernade, Claire and Hsu, Daniel},
  volume    = {237},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Feb},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v237/karbasi24a/karbasi24a.pdf},
  url       = {https://proceedings.mlr.press/v237/karbasi24a.html},
  abstract  = {The aim of boosting is to convert a sequence of weak learners into a strong learner. At their heart, these methods are fully sequential. In this paper, we investigate the possibility of parallelizing boosting. Our main contribution is a strong negative result, implying that significant parallelization of boosting requires an exponential blow-up in the total computing resources needed for training.}
}
Endnote
%0 Conference Paper
%T The Impossibility of Parallelizing Boosting
%A Amin Karbasi
%A Kasper Green Larsen
%B Proceedings of The 35th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Claire Vernade
%E Daniel Hsu
%F pmlr-v237-karbasi24a
%I PMLR
%P 635--653
%U https://proceedings.mlr.press/v237/karbasi24a.html
%V 237
%X The aim of boosting is to convert a sequence of weak learners into a strong learner. At their heart, these methods are fully sequential. In this paper, we investigate the possibility of parallelizing boosting. Our main contribution is a strong negative result, implying that significant parallelization of boosting requires an exponential blow-up in the total computing resources needed for training.
APA
Karbasi, A. & Green Larsen, K. (2024). The Impossibility of Parallelizing Boosting. Proceedings of The 35th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 237:635-653. Available from https://proceedings.mlr.press/v237/karbasi24a.html.
