Simple online learning with consistent oracle

Alexander Kozachinskiy, Tomasz Steifer
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:3241-3256, 2024.

Abstract

We consider online learning in the model where a learning algorithm can access the class only via the \emph{consistent oracle}: an oracle that, at any moment, can give a function from the class that agrees with all examples seen so far. This model was recently considered by Assos et al. (COLT'23). It is motivated by the fact that standard methods of online learning rely on computing the Littlestone dimension of subclasses, a computationally intractable problem. Assos et al. gave an online learning algorithm in this model that makes at most $C^d$ mistakes on classes of Littlestone dimension $d$, for some absolute unspecified constant $C > 0$. We give a novel algorithm that makes at most $O(256^d)$ mistakes. Our proof is significantly simpler and uses only very basic properties of the Littlestone dimension. We also show that there exists no algorithm in this model that makes fewer than $3^d$ mistakes. Our algorithm (as well as the algorithm of Assos et al.) solves an open problem posed by Hasrati and Ben-David (ALT'23). Namely, it demonstrates that every class of finite Littlestone dimension with a recursively enumerable representation admits a computable online learner (one that may be undefined on unrealizable samples).
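To make the access model concrete, here is a minimal toy sketch (not the paper's algorithm, and not its mistake-bound strategy): the learner never inspects the class directly, it only queries a consistent oracle that returns some class member agreeing with the examples seen so far. The finite class `CLASS` over a four-point domain is an invented illustration.

```python
# Toy sketch of the consistent-oracle access model. The hypothesis class,
# the domain, and the naive "follow the consistent hypothesis" strategy
# below are illustrative assumptions, not the algorithm from the paper.

DOMAIN = range(4)

# A small hypothetical class, given as truth tables over DOMAIN.
CLASS = [
    {0: 0, 1: 0, 2: 0, 3: 0},
    {0: 1, 1: 0, 2: 0, 3: 0},
    {0: 1, 1: 1, 2: 0, 3: 0},
    {0: 1, 1: 1, 2: 1, 3: 0},
]

def consistent_oracle(examples):
    """Return some function from CLASS that agrees with every (x, y) pair
    seen so far, or None if the sample is unrealizable by the class."""
    for h in CLASS:
        if all(h[x] == y for x, y in examples):
            return h
    return None

def follow_the_consistent(stream):
    """Naive learner: predict with the oracle's current hypothesis and
    re-query the oracle after each mistake. Returns the mistake count."""
    examples, mistakes = [], 0
    h = consistent_oracle(examples)
    for x, y in stream:
        if h[x] != y:
            mistakes += 1
        examples.append((x, y))
        h = consistent_oracle(examples)
    return mistakes

# A realizable stream labeled by CLASS[2].
stream = [(x, CLASS[2][x]) for x in DOMAIN]
print(follow_the_consistent(stream))  # → 2
```

This naive strategy has no good mistake bound in general; the point of the paper is that a more careful use of the same oracle achieves $O(256^d)$ mistakes on classes of Littlestone dimension $d$.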

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-kozachinskiy24a, title = {Simple online learning with consistent oracle}, author = {Kozachinskiy, Alexander and Steifer, Tomasz}, booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory}, pages = {3241--3256}, year = {2024}, editor = {Agrawal, Shipra and Roth, Aaron}, volume = {247}, series = {Proceedings of Machine Learning Research}, month = {30 Jun--03 Jul}, publisher = {PMLR}, pdf = {https://proceedings.mlr.press/v247/kozachinskiy24a/kozachinskiy24a.pdf}, url = {https://proceedings.mlr.press/v247/kozachinskiy24a.html}, abstract = {We consider online learning in the model where a learning algorithm can access the class only via the \emph{consistent oracle}: an oracle that, at any moment, can give a function from the class that agrees with all examples seen so far. This model was recently considered by Assos et al. (COLT'23). It is motivated by the fact that standard methods of online learning rely on computing the Littlestone dimension of subclasses, a computationally intractable problem. Assos et al. gave an online learning algorithm in this model that makes at most $C^d$ mistakes on classes of Littlestone dimension $d$, for some absolute unspecified constant $C > 0$. We give a novel algorithm that makes at most $O(256^d)$ mistakes. Our proof is significantly simpler and uses only very basic properties of the Littlestone dimension. We also show that there exists no algorithm in this model that makes fewer than $3^d$ mistakes. Our algorithm (as well as the algorithm of Assos et al.) solves an open problem posed by Hasrati and Ben-David (ALT'23). Namely, it demonstrates that every class of finite Littlestone dimension with a recursively enumerable representation admits a computable online learner (one that may be undefined on unrealizable samples).} }
Endnote
%0 Conference Paper %T Simple online learning with consistent oracle %A Alexander Kozachinskiy %A Tomasz Steifer %B Proceedings of Thirty Seventh Conference on Learning Theory %C Proceedings of Machine Learning Research %D 2024 %E Shipra Agrawal %E Aaron Roth %F pmlr-v247-kozachinskiy24a %I PMLR %P 3241--3256 %U https://proceedings.mlr.press/v247/kozachinskiy24a.html %V 247 %X We consider online learning in the model where a learning algorithm can access the class only via the \emph{consistent oracle}: an oracle that, at any moment, can give a function from the class that agrees with all examples seen so far. This model was recently considered by Assos et al. (COLT'23). It is motivated by the fact that standard methods of online learning rely on computing the Littlestone dimension of subclasses, a computationally intractable problem. Assos et al. gave an online learning algorithm in this model that makes at most $C^d$ mistakes on classes of Littlestone dimension $d$, for some absolute unspecified constant $C > 0$. We give a novel algorithm that makes at most $O(256^d)$ mistakes. Our proof is significantly simpler and uses only very basic properties of the Littlestone dimension. We also show that there exists no algorithm in this model that makes fewer than $3^d$ mistakes. Our algorithm (as well as the algorithm of Assos et al.) solves an open problem posed by Hasrati and Ben-David (ALT'23). Namely, it demonstrates that every class of finite Littlestone dimension with a recursively enumerable representation admits a computable online learner (one that may be undefined on unrealizable samples).
APA
Kozachinskiy, A. & Steifer, T. (2024). Simple online learning with consistent oracle. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:3241-3256. Available from https://proceedings.mlr.press/v247/kozachinskiy24a.html.