A Unified Theory of Supervised Online Learnability

Vinod Raman, Unique Subedi, Ambuj Tewari
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:985-1007, 2025.

Abstract

We study the online learnability of hypothesis classes with respect to arbitrary, but bounded loss functions. No characterization of online learnability is known at this level of generality. In this paper, we close this gap by showing that existing techniques can be used to characterize any online learning problem with a bounded loss function. Along the way, we give a new scale-sensitive combinatorial dimension, named the Sequential Minimax dimension, that generalizes all existing dimensions in online learning theory and provides upper and lower bounds on the minimax value.

Cite this Paper


BibTeX
@InProceedings{pmlr-v272-raman25a,
  title     = {A Unified Theory of Supervised Online Learnability},
  author    = {Raman, Vinod and Subedi, Unique and Tewari, Ambuj},
  booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
  pages     = {985--1007},
  year      = {2025},
  editor    = {Kamath, Gautam and Loh, Po-Ling},
  volume    = {272},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v272/main/assets/raman25a/raman25a.pdf},
  url       = {https://proceedings.mlr.press/v272/raman25a.html},
  abstract  = {We study the online learnability of hypothesis classes with respect to arbitrary, but bounded loss functions. No characterization of online learnability is known at this level of generality. In this paper, we close this gap by showing that existing techniques can be used to characterize any online learning problem with a bounded loss function. Along the way, we give a new scale-sensitive combinatorial dimension, named the Sequential Minimax dimension, that generalizes all existing dimensions in online learning theory and provides upper and lower bounds on the minimax value.}
}
Endnote
%0 Conference Paper
%T A Unified Theory of Supervised Online Learnability
%A Vinod Raman
%A Unique Subedi
%A Ambuj Tewari
%B Proceedings of The 36th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Gautam Kamath
%E Po-Ling Loh
%F pmlr-v272-raman25a
%I PMLR
%P 985--1007
%U https://proceedings.mlr.press/v272/raman25a.html
%V 272
%X We study the online learnability of hypothesis classes with respect to arbitrary, but bounded loss functions. No characterization of online learnability is known at this level of generality. In this paper, we close this gap by showing that existing techniques can be used to characterize any online learning problem with a bounded loss function. Along the way, we give a new scale-sensitive combinatorial dimension, named the Sequential Minimax dimension, that generalizes all existing dimensions in online learning theory and provides upper and lower bounds on the minimax value.
APA
Raman, V., Subedi, U., & Tewari, A. (2025). A Unified Theory of Supervised Online Learnability. Proceedings of The 36th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 272:985-1007. Available from https://proceedings.mlr.press/v272/raman25a.html.