Open Problem: Information Complexity of VC Learning

Thomas Steinke, Lydia Zakynthinou
Proceedings of Thirty Third Conference on Learning Theory, PMLR 125:3857-3863, 2020.

Abstract

Uniform convergence approaches learning by studying the complexity of hypothesis classes. In particular, hypothesis classes with bounded Vapnik-Chervonenkis dimension exhibit strong uniform convergence, that is, any hypothesis in the class has low generalization error. On the other hand, a long line of work studies the information complexity of a learning algorithm, as it is connected to several desired properties, including generalization. We ask whether all VC classes admit a learner with low information complexity which achieves the generalization bounds guaranteed by uniform convergence. Specifically, since we know that this is not possible if we consider proper and consistent learners and measure information complexity in terms of the mutual information (Bassily et al., 2018), we are interested in learners with low information complexity measured in terms of the recently introduced notion of CMI (Steinke and Zakynthinou, 2020). Can we obtain tight bounds on the information complexity of a learning algorithm for a VC class (via CMI), thus exactly retrieving the known generalization bounds implied for this class by uniform convergence?
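
For context, the quantities being compared can be sketched informally as follows (a summary in notation loosely following Steinke and Zakynthinou (2020); the precise definitions and constants are in the paper and its references). Given a distribution \(\mathcal{D}\) over examples, draw a supersample \(\tilde{Z} \in \mathcal{Z}^{n \times 2}\) of \(2n\) i.i.d. samples arranged in \(n\) pairs, and an independent selector \(S \sim \mathrm{Uniform}(\{0,1\}^n)\); the training set \(\tilde{Z}_S\) takes one sample from each pair according to \(S\). The conditional mutual information of a learner \(A\) is

\[ \mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\big(A(\tilde{Z}_S);\, S \,\big|\, \tilde{Z}\big), \]

and, for losses bounded in \([0,1]\), it controls the expected generalization gap via

\[ \Big|\, \mathbb{E}\big[L_{\mathcal{D}}(A(\tilde{Z}_S)) - L_{\tilde{Z}_S}(A(\tilde{Z}_S))\big] \Big| \;\le\; \sqrt{\frac{2\,\mathrm{CMI}_{\mathcal{D}}(A)}{n}}. \]

Uniform convergence for a class of VC dimension \(d\) bounds the same gap by roughly \(\sqrt{d/n}\), so the open problem amounts to asking whether every such class admits a learner with \(\mathrm{CMI}_{\mathcal{D}}(A) = O(d)\).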

Cite this Paper


BibTeX
@InProceedings{pmlr-v125-steinke20b,
  title     = {{O}pen {P}roblem: {I}nformation {C}omplexity of {VC} {L}earning},
  author    = {Steinke, Thomas and Zakynthinou, Lydia},
  booktitle = {Proceedings of Thirty Third Conference on Learning Theory},
  pages     = {3857--3863},
  year      = {2020},
  editor    = {Abernethy, Jacob and Agarwal, Shivani},
  volume    = {125},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v125/steinke20b/steinke20b.pdf},
  url       = {https://proceedings.mlr.press/v125/steinke20b.html},
  abstract  = {Uniform convergence approaches learning by studying the complexity of hypothesis classes. In particular, hypothesis classes with bounded Vapnik-Chervonenkis dimension exhibit strong uniform convergence, that is, any hypothesis in the class has low generalization error. On the other hand, a long line of work studies the information complexity of a learning algorithm, as it is connected to several desired properties, including generalization. We ask whether all VC classes admit a learner with low information complexity which achieves the generalization bounds guaranteed by uniform convergence. Specifically, since we know that this is not possible if we consider proper and consistent learners and measure information complexity in terms of the mutual information (Bassily et al., 2018), we are interested in learners with low information complexity measured in terms of the recently introduced notion of CMI (Steinke and Zakynthinou, 2020). Can we obtain tight bounds on the information complexity of a learning algorithm for a VC class (via CMI), thus exactly retrieving the known generalization bounds implied for this class by uniform convergence?}
}
Endnote
%0 Conference Paper
%T Open Problem: Information Complexity of VC Learning
%A Thomas Steinke
%A Lydia Zakynthinou
%B Proceedings of Thirty Third Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Jacob Abernethy
%E Shivani Agarwal
%F pmlr-v125-steinke20b
%I PMLR
%P 3857--3863
%U https://proceedings.mlr.press/v125/steinke20b.html
%V 125
%X Uniform convergence approaches learning by studying the complexity of hypothesis classes. In particular, hypothesis classes with bounded Vapnik-Chervonenkis dimension exhibit strong uniform convergence, that is, any hypothesis in the class has low generalization error. On the other hand, a long line of work studies the information complexity of a learning algorithm, as it is connected to several desired properties, including generalization. We ask whether all VC classes admit a learner with low information complexity which achieves the generalization bounds guaranteed by uniform convergence. Specifically, since we know that this is not possible if we consider proper and consistent learners and measure information complexity in terms of the mutual information (Bassily et al., 2018), we are interested in learners with low information complexity measured in terms of the recently introduced notion of CMI (Steinke and Zakynthinou, 2020). Can we obtain tight bounds on the information complexity of a learning algorithm for a VC class (via CMI), thus exactly retrieving the known generalization bounds implied for this class by uniform convergence?
APA
Steinke, T., & Zakynthinou, L. (2020). Open Problem: Information Complexity of VC Learning. Proceedings of Thirty Third Conference on Learning Theory, in Proceedings of Machine Learning Research 125:3857-3863. Available from https://proceedings.mlr.press/v125/steinke20b.html.
