Computable learning of natural hypothesis classes

Syed Akbari, Matthew Harrison-Trainor
Proceedings of Thirty Eighth Conference on Learning Theory, PMLR 291:2-21, 2025.

Abstract

This paper is about the recent notion of computably probably approximately correct (PAC) learning, which lies between statistical learning theory, where there is no computational requirement on the learner, and efficient PAC learning, where the learner must be polynomially bounded. Examples have recently been given of hypothesis classes which are PAC-learnable but not computably PAC-learnable, but these hypothesis classes can be viewed as unnatural or non-canonical. We use the on-a-cone machinery from computability theory to prove that, under certain assumptions on the hypothesis class, any “natural” hypothesis class which is learnable must be computably learnable.
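As a quick orientation, the standard realizable-case definitions underlying the abstract can be sketched as follows; this is a generic sketch in textbook notation (H, X, A, m, err are not the paper's own symbols), not the paper's formulation.

% Standard (realizable) PAC learnability of a hypothesis class H over a
% domain X: there is a learner A and a sample-complexity bound m such that
\[
  \forall \epsilon,\delta \in (0,1)\;\;
  \forall D \text{ on } X\;\;
  \forall h^{*} \in \mathcal{H}:\quad
  \Pr_{S \sim D^{m(\epsilon,\delta)}}
  \!\left[\operatorname{err}_{D,h^{*}}\!\big(A(S)\big) \le \epsilon\right]
  \ge 1-\delta
\]
% where S is an i.i.d. sample labeled by h* and err_{D,h*} is the true
% error of the returned hypothesis. Plain PAC learning places no
% effectivity condition on A; computable PAC learning requires A to be a
% computable function (on suitably encoded samples) that outputs
% computable hypotheses; efficient PAC learning further requires A to run
% in time polynomial in 1/epsilon, 1/delta, and the instance size.

On this scale, every efficiently PAC-learnable class is computably PAC-learnable, and the paper asks when plain PAC learnability already implies computable PAC learnability for “natural” classes.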

Cite this Paper

BibTeX
@InProceedings{pmlr-v291-akbari25a,
  title     = {Computable learning of natural hypothesis classes},
  author    = {Akbari, Syed and Harrison-Trainor, Matthew},
  booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
  pages     = {2--21},
  year      = {2025},
  editor    = {Haghtalab, Nika and Moitra, Ankur},
  volume    = {291},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--04 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v291/main/assets/akbari25a/akbari25a.pdf},
  url       = {https://proceedings.mlr.press/v291/akbari25a.html},
  abstract  = {This paper is about the recent notion of computably probably approximately correct (PAC) learning, which lies between statistical learning theory, where there is no computational requirement on the learner, and efficient PAC learning, where the learner must be polynomially bounded. Examples have recently been given of hypothesis classes which are PAC-learnable but not computably PAC-learnable, but these hypothesis classes can be viewed as unnatural or non-canonical. We use the on-a-cone machinery from computability theory to prove that, under certain assumptions on the hypothesis class, any “natural” hypothesis class which is learnable must be computably learnable.}
}
Endnote
%0 Conference Paper
%T Computable learning of natural hypothesis classes
%A Syed Akbari
%A Matthew Harrison-Trainor
%B Proceedings of Thirty Eighth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Nika Haghtalab
%E Ankur Moitra
%F pmlr-v291-akbari25a
%I PMLR
%P 2--21
%U https://proceedings.mlr.press/v291/akbari25a.html
%V 291
%X This paper is about the recent notion of computably probably approximately correct (PAC) learning, which lies between statistical learning theory, where there is no computational requirement on the learner, and efficient PAC learning, where the learner must be polynomially bounded. Examples have recently been given of hypothesis classes which are PAC-learnable but not computably PAC-learnable, but these hypothesis classes can be viewed as unnatural or non-canonical. We use the on-a-cone machinery from computability theory to prove that, under certain assumptions on the hypothesis class, any “natural” hypothesis class which is learnable must be computably learnable.
APA
Akbari, S. & Harrison-Trainor, M. (2025). Computable learning of natural hypothesis classes. Proceedings of Thirty Eighth Conference on Learning Theory, in Proceedings of Machine Learning Research 291:2-21. Available from https://proceedings.mlr.press/v291/akbari25a.html.
