The Impossibility of Parallelizing Boosting
Proceedings of The 35th International Conference on Algorithmic Learning Theory, PMLR 237:635-653, 2024.
Abstract
The aim of boosting is to convert a sequence of weak learners into a strong learner. At their heart, these methods are fully sequential. In this paper, we investigate the possibility of parallelizing boosting. Our main contribution is a strong negative result, implying that significant parallelization of boosting requires an exponential blow-up in the total computing resources needed for training.
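The sequential bottleneck the abstract refers to can be seen in a minimal AdaBoost-style sketch (a standard boosting algorithm used here for illustration; the toy data, threshold stumps, and round count are hypothetical and not from the paper). The key point is in the reweighting step: the weights used to pick the weak learner in round t are a function of the hypothesis chosen in round t-1, so the rounds cannot be run independently.

```python
import math

# Hypothetical toy 1-D dataset: points and their +/-1 labels.
X = [0.1, 0.2, 0.3, 0.6, 0.7, 0.9]
y = [1, 1, -1, -1, 1, 1]

def stump(threshold, sign):
    # Weak learner: predicts `sign` above the threshold, -sign below.
    return lambda x: sign if x > threshold else -sign

def best_stump(weights):
    # Pick the stump with the lowest weighted error under the current weights.
    best, best_err = None, float("inf")
    for t in [0.0, 0.15, 0.25, 0.45, 0.65, 0.8, 1.0]:
        for s in (1, -1):
            h = stump(t, s)
            err = sum(w for w, xi, yi in zip(weights, X, y) if h(xi) != yi)
            if err < best_err:
                best, best_err = h, err
    return best, best_err

def adaboost(rounds):
    n = len(X)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha, weak hypothesis) pairs
    for _ in range(rounds):
        h, err = best_stump(weights)
        err = max(err, 1e-12)  # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # Reweighting: round t's weights depend on round t-1's chosen
        # hypothesis -- this data dependence is what makes the loop
        # inherently sequential.
        weights = [w * math.exp(-alpha * yi * h(xi))
                   for w, xi, yi in zip(weights, X, y)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return ensemble

def predict(ensemble, x):
    # Weighted majority vote of the weak learners.
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

ensemble = adaboost(rounds=3)
print([predict(ensemble, xi) for xi in X])  # -> [1, 1, -1, -1, 1, 1]
```

On this toy data, three rounds suffice for the combined vote to fit the labels, even though no single stump can. The paper's negative result concerns whether this weight-update chain can be shortcut: collapsing many sequential rounds into few parallel ones forces an exponential increase in total training work.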