A Novel Data-Dependent Learning Paradigm for Large Hypothesis Classes

Alireza F. Pour, Shai Ben-David
Proceedings of The 37th International Conference on Algorithmic Learning Theory, PMLR 313:1-27, 2026.

Abstract

We address the general task of learning with a set of candidate models that is too large to admit uniform convergence of empirical estimates to true losses. While the common approach to such challenges is SRM (or regularization) based learning algorithms, we propose a novel learning paradigm that relies on stronger incorporation of empirical data and requires fewer algorithmic decisions to be based on prior assumptions. We analyze the generalization capabilities of our approach and demonstrate its merits under several common learning assumptions, including similarity of close points, clustering of the domain into highly label-homogeneous regions, Lipschitzness of the labeling rule, and contrastive learning assumptions. Our approach allows utilizing such assumptions without the need to know their true parameters a priori.

Cite this Paper


BibTeX
@InProceedings{pmlr-v313-pour26a,
  title     = {A Novel Data-Dependent Learning Paradigm for Large Hypothesis Classes},
  author    = {Pour, Alireza F. and Ben-David, Shai},
  booktitle = {Proceedings of The 37th International Conference on Algorithmic Learning Theory},
  pages     = {1--27},
  year      = {2026},
  editor    = {Telgarsky, Matus and Ullman, Jonathan},
  volume    = {313},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--26 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v313/main/assets/pour26a/pour26a.pdf},
  url       = {https://proceedings.mlr.press/v313/pour26a.html},
  abstract  = {We address the general task of learning with a set of candidate models that is too large to have a uniform convergence of empirical estimates to true losses. While the common approach to such challenges is SRM (or regularization) based learning algorithms, we propose a novel learning paradigm that relies on stronger incorporation of empirical data and requires less algorithmic decisions to be based on prior assumptions. We analyze the generalization capabilities of our approach and demonstrate its merits in several common learning assumptions, including similarity of close points, clustering of the domain into highly label-homogeneous regions, Lipschitzness assumptions of the labeling rule, and contrastive learning assumptions. Our approach allows utilizing such assumptions without the need to know their true parameters a priori.}
}
Endnote
%0 Conference Paper
%T A Novel Data-Dependent Learning Paradigm for Large Hypothesis Classes
%A Alireza F. Pour
%A Shai Ben-David
%B Proceedings of The 37th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2026
%E Matus Telgarsky
%E Jonathan Ullman
%F pmlr-v313-pour26a
%I PMLR
%P 1--27
%U https://proceedings.mlr.press/v313/pour26a.html
%V 313
%X We address the general task of learning with a set of candidate models that is too large to have a uniform convergence of empirical estimates to true losses. While the common approach to such challenges is SRM (or regularization) based learning algorithms, we propose a novel learning paradigm that relies on stronger incorporation of empirical data and requires less algorithmic decisions to be based on prior assumptions. We analyze the generalization capabilities of our approach and demonstrate its merits in several common learning assumptions, including similarity of close points, clustering of the domain into highly label-homogeneous regions, Lipschitzness assumptions of the labeling rule, and contrastive learning assumptions. Our approach allows utilizing such assumptions without the need to know their true parameters a priori.
APA
Pour, A.F. & Ben-David, S. (2026). A Novel Data-Dependent Learning Paradigm for Large Hypothesis Classes. Proceedings of The 37th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 313:1-27. Available from https://proceedings.mlr.press/v313/pour26a.html.