On Learnability with Computable Learners

Sushant Agarwal, Nivasini Ananthakrishnan, Shai Ben-David, Tosca Lechner, Ruth Urner
Proceedings of the 31st International Conference on Algorithmic Learning Theory, PMLR 117:48-60, 2020.

Abstract

We initiate a study of learning with computable learners and computable output predictors. Recent results in statistical learning theory have shown that there are basic learning problems whose learnability cannot be determined within ZFC. This motivates us to consider learnability by algorithms with computable output predictors (both learners and predictors are then representable as finite objects). We thus propose the notion of CPAC learnability, by adding some basic computability requirements into a PAC learning framework. As a first step towards a characterization, we show that in this framework learnability of a binary hypothesis class is no longer implied by finiteness of its VC-dimension. We also present some situations where we are guaranteed to have a computable learner.

Cite this Paper


BibTeX
@InProceedings{pmlr-v117-agarwal20b,
  title = {On Learnability with Computable Learners},
  author = {Agarwal, Sushant and Ananthakrishnan, Nivasini and Ben-David, Shai and Lechner, Tosca and Urner, Ruth},
  booktitle = {Proceedings of the 31st International Conference on Algorithmic Learning Theory},
  pages = {48--60},
  year = {2020},
  editor = {Kontorovich, Aryeh and Neu, Gergely},
  volume = {117},
  series = {Proceedings of Machine Learning Research},
  month = {08 Feb--11 Feb},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v117/agarwal20b/agarwal20b.pdf},
  url = {https://proceedings.mlr.press/v117/agarwal20b.html},
  abstract = {We initiate a study of learning with computable learners and computable output predictors. Recent results in statistical learning theory have shown that there are basic learning problems whose learnability cannot be determined within ZFC. This motivates us to consider learnability by algorithms with computable output predictors (both learners and predictors are then representable as finite objects). We thus propose the notion of CPAC learnability, by adding some basic computability requirements into a PAC learning framework. As a first step towards a characterization, we show that in this framework learnability of a binary hypothesis class is no longer implied by finiteness of its VC-dimension. We also present some situations where we are guaranteed to have a computable learner.}
}
Endnote
%0 Conference Paper
%T On Learnability with Computable Learners
%A Sushant Agarwal
%A Nivasini Ananthakrishnan
%A Shai Ben-David
%A Tosca Lechner
%A Ruth Urner
%B Proceedings of the 31st International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Aryeh Kontorovich
%E Gergely Neu
%F pmlr-v117-agarwal20b
%I PMLR
%P 48--60
%U https://proceedings.mlr.press/v117/agarwal20b.html
%V 117
%X We initiate a study of learning with computable learners and computable output predictors. Recent results in statistical learning theory have shown that there are basic learning problems whose learnability cannot be determined within ZFC. This motivates us to consider learnability by algorithms with computable output predictors (both learners and predictors are then representable as finite objects). We thus propose the notion of CPAC learnability, by adding some basic computability requirements into a PAC learning framework. As a first step towards a characterization, we show that in this framework learnability of a binary hypothesis class is no longer implied by finiteness of its VC-dimension. We also present some situations where we are guaranteed to have a computable learner.
APA
Agarwal, S., Ananthakrishnan, N., Ben-David, S., Lechner, T. & Urner, R. (2020). On Learnability with Computable Learners. Proceedings of the 31st International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 117:48-60. Available from https://proceedings.mlr.press/v117/agarwal20b.html.