A Unified Theory of Supervised Online Learnability
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:985-1007, 2025.
Abstract
We study the online learnability of hypothesis classes with respect to arbitrary, but bounded, loss functions. No characterization of online learnability is known at this level of generality. In this paper, we close this gap by showing that existing techniques can be used to characterize any online learning problem with a bounded loss function. Along the way, we introduce a new scale-sensitive combinatorial dimension, named the Sequential Minimax dimension, that generalizes all existing dimensions in online learning theory and provides upper and lower bounds on the minimax value.
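For context, the minimax value that the Sequential Minimax dimension bounds can be formalized as the value of a sequential game between learner and adversary. The display below is an illustrative sketch in the style of standard online learning formulations (e.g., Rakhlin–Sridharan); the notation (instance space $\mathcal{X}$, label space $\mathcal{Y}$, prediction space $\mathcal{D}$, horizon $T$) is assumed here for illustration and may differ from the paper's own setup.

\[
\mathcal{V}_T(\mathcal{H}, \ell)
\;=\;
\Bigl\langle\!\Bigl\langle \sup_{x_t \in \mathcal{X}} \; \inf_{q_t \in \Delta(\mathcal{D})} \; \sup_{y_t \in \mathcal{Y}} \; \mathbb{E}_{\hat{y}_t \sim q_t} \Bigr\rangle\!\Bigr\rangle_{t=1}^{T}
\left[ \sum_{t=1}^{T} \ell(\hat{y}_t, y_t) \;-\; \inf_{h \in \mathcal{H}} \sum_{t=1}^{T} \ell(h(x_t), y_t) \right],
\]

where the double-bracket notation abbreviates the nested alternation of the three operators over rounds $t = 1, \dots, T$, and boundedness of the loss $\ell$ ensures the value is finite. Sublinear growth of this quantity in $T$ corresponds to online learnability of $\mathcal{H}$ under $\ell$.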