Improved Margin Generalization Bounds for Voting Classifiers

Mikael Høgsgaard Møller, Kasper Green Larsen
Proceedings of Thirty Eighth Conference on Learning Theory, PMLR 291:2822-2855, 2025.

Abstract

In this paper we establish a new margin-based generalization bound for voting classifiers, refining existing results and yielding tighter generalization guarantees for widely used boosting algorithms such as AdaBoost (Freund and Schapire, 1997). Furthermore, the new margin-based generalization bound enables the derivation of an optimal weak-to-strong learner: a Majority-of-3 of large-margin classifiers with an expected error matching the theoretical lower bound. This result provides a more natural alternative to the Majority-of-5 algorithm by Høgsgaard et al. (2024), and matches the Majority-of-3 result by Aden-Ali et al. (2024) for the realizable prediction model.
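For context (this sketch is not from the paper itself), the kind of bound being refined is the classical margin bound of Schapire, Freund, Bartlett, and Lee (1998): for a voting classifier $f$ over a base class of VC dimension $d$, a sample $S$ of size $m$, and a margin parameter $\theta > 0$, with probability at least $1-\delta$,

```latex
% Classical margin bound (Schapire et al., 1998), shown for context only;
% the paper establishes a tighter bound of this general form.
\Pr_{(x,y)\sim\mathcal{D}}\bigl[y f(x) \le 0\bigr]
  \;\le\; \Pr_{(x,y)\sim S}\bigl[y f(x) \le \theta\bigr]
  \;+\; O\!\left(\sqrt{\frac{d \log^2(m/d)}{\theta^2 m} + \frac{\log(1/\delta)}{m}}\right)
```

That is, the true error is bounded by the fraction of training points with margin at most $\theta$, plus a complexity term that shrinks as the margin and sample size grow.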

Cite this Paper


BibTeX
@InProceedings{pmlr-v291-hogsgaard-moller25a,
  title     = {Improved Margin Generalization Bounds for Voting Classifiers},
  author    = {H{\o}gsgaard M{\o}ller, Mikael and Green Larsen, Kasper},
  booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
  pages     = {2822--2855},
  year      = {2025},
  editor    = {Haghtalab, Nika and Moitra, Ankur},
  volume    = {291},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--04 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v291/main/assets/hogsgaard-moller25a/hogsgaard-moller25a.pdf},
  url       = {https://proceedings.mlr.press/v291/hogsgaard-moller25a.html},
  abstract  = {In this paper we establish a new margin-based generalization bound for voting classifiers, refining existing results and yielding tighter generalization guarantees for widely used boosting algorithms such as AdaBoost (Freund and Schapire, 1997). Furthermore, the new margin-based generalization bound enables the derivation of an optimal weak-to-strong learner: a Majority-of-3 of large-margin classifiers with an expected error matching the theoretical lower bound. This result provides a more natural alternative to the Majority-of-5 algorithm by H{\o}gsgaard et al. (2024), and matches the Majority-of-3 result by Aden-Ali et al. (2024) for the realizable prediction model.}
}
Endnote
%0 Conference Paper
%T Improved Margin Generalization Bounds for Voting Classifiers
%A Mikael Høgsgaard Møller
%A Kasper Green Larsen
%B Proceedings of Thirty Eighth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Nika Haghtalab
%E Ankur Moitra
%F pmlr-v291-hogsgaard-moller25a
%I PMLR
%P 2822--2855
%U https://proceedings.mlr.press/v291/hogsgaard-moller25a.html
%V 291
%X In this paper we establish a new margin-based generalization bound for voting classifiers, refining existing results and yielding tighter generalization guarantees for widely used boosting algorithms such as AdaBoost (Freund and Schapire, 1997). Furthermore, the new margin-based generalization bound enables the derivation of an optimal weak-to-strong learner: a Majority-of-3 of large-margin classifiers with an expected error matching the theoretical lower bound. This result provides a more natural alternative to the Majority-of-5 algorithm by Høgsgaard et al. (2024), and matches the Majority-of-3 result by Aden-Ali et al. (2024) for the realizable prediction model.
APA
Høgsgaard Møller, M. & Green Larsen, K. (2025). Improved Margin Generalization Bounds for Voting Classifiers. Proceedings of Thirty Eighth Conference on Learning Theory, in Proceedings of Machine Learning Research 291:2822-2855. Available from https://proceedings.mlr.press/v291/hogsgaard-moller25a.html.

Related Material