Rate-Distortion Theoretic Generalization Bounds for Stochastic Learning Algorithms

Milad Sefidgaran, Amin Gohari, Gaël Richard, Umut Simsekli
Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:4416-4463, 2022.

Abstract

Understanding generalization in modern machine learning settings has been one of the major challenges in statistical learning theory. In this context, recent years have witnessed the development of various generalization bounds suggesting different complexity notions such as the mutual information between the data sample and the algorithm output, compressibility of the hypothesis space, and the fractal dimension of the hypothesis space. While these bounds have illuminated the problem at hand from different angles, their suggested complexity notions might appear seemingly unrelated, thereby restricting their high-level impact. In this study, we prove novel generalization bounds through the lens of rate-distortion theory, and explicitly relate the concepts of mutual information, compressibility, and fractal dimensions in a single mathematical framework. Our approach consists of (i) defining a generalized notion of compressibility by using source coding concepts, and (ii) showing that the 'compression error rate' can be linked to the generalization error both in expectation and with high probability. We show that in the 'lossless compression' setting, we recover and improve existing mutual information-based bounds, whereas a 'lossy compression' scheme allows us to link generalization to the rate-distortion dimension, a particular notion of fractal dimension. Our results bring a more unified perspective on generalization and open up several future research directions.
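For readers unfamiliar with the two information-theoretic quantities the abstract invokes, the following is a minimal LaTeX sketch of (a) the classical mutual-information generalization bound of Xu and Raginsky (2017), which the 'lossless compression' setting is said to recover and improve, and (b) the rate-distortion function from source coding that underlies the 'lossy' setting. The notation (sample S, hypothesis W, sample size n, sub-Gaussian constant \sigma, distortion level D) is standard textbook notation and is not taken from the paper itself.

% Sketch of standard background definitions; not the paper's own statements.
% Mutual-information bound (Xu & Raginsky, 2017): if the loss \ell(w, Z) is
% \sigma-sub-Gaussian for Z \sim \mu, then an algorithm with output W trained
% on the sample S = (Z_1, \ldots, Z_n) satisfies
\[
  \bigl|\mathbb{E}[\operatorname{gen}(S, W)]\bigr|
  \;\le\; \sqrt{\frac{2\sigma^{2}}{n}\, I(S; W)},
\]
% where I(S; W) is the mutual information between the data and the output.
%
% Rate-distortion function (source coding): the minimum rate needed to
% describe W within average distortion D, under a distortion measure d, is
\[
  R(D) \;=\; \min_{P_{\widehat{W}\mid W}\,:\; \mathbb{E}[d(W, \widehat{W})] \le D}
  I(W; \widehat{W}).
\]
% The rate-distortion dimension mentioned in the abstract is commonly defined
% (Kawabata & Dembo, 1994) as the scaling exponent of R(D) as D \to 0, e.g.
% \dim_{R}(W) = \lim_{D \to 0} 2\,R(D) / \log(1/D).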

Cite this Paper


BibTeX
@InProceedings{pmlr-v178-sefidgaran22a,
  title     = {Rate-Distortion Theoretic Generalization Bounds for Stochastic Learning Algorithms},
  author    = {Sefidgaran, Milad and Gohari, Amin and Richard, Ga\"el and Simsekli, Umut},
  booktitle = {Proceedings of Thirty Fifth Conference on Learning Theory},
  pages     = {4416--4463},
  year      = {2022},
  editor    = {Loh, Po-Ling and Raginsky, Maxim},
  volume    = {178},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--05 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v178/sefidgaran22a/sefidgaran22a.pdf},
  url       = {https://proceedings.mlr.press/v178/sefidgaran22a.html},
  abstract  = {Understanding generalization in modern machine learning settings has been one of the major challenges in statistical learning theory. In this context, recent years have witnessed the development of various generalization bounds suggesting different complexity notions such as the mutual information between the data sample and the algorithm output, compressibility of the hypothesis space, and the fractal dimension of the hypothesis space. While these bounds have illuminated the problem at hand from different angles, their suggested complexity notions might appear seemingly unrelated, thereby restricting their high-level impact. In this study, we prove novel generalization bounds through the lens of rate-distortion theory, and explicitly relate the concepts of mutual information, compressibility, and fractal dimensions in a single mathematical framework. Our approach consists of (i) defining a generalized notion of compressibility by using source coding concepts, and (ii) showing that the 'compression error rate' can be linked to the generalization error both in expectation and with high probability. We show that in the 'lossless compression' setting, we recover and improve existing mutual information-based bounds, whereas a 'lossy compression' scheme allows us to link generalization to the rate-distortion dimension, a particular notion of fractal dimension. Our results bring a more unified perspective on generalization and open up several future research directions.}
}
Endnote
%0 Conference Paper
%T Rate-Distortion Theoretic Generalization Bounds for Stochastic Learning Algorithms
%A Milad Sefidgaran
%A Amin Gohari
%A Gaël Richard
%A Umut Simsekli
%B Proceedings of Thirty Fifth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2022
%E Po-Ling Loh
%E Maxim Raginsky
%F pmlr-v178-sefidgaran22a
%I PMLR
%P 4416--4463
%U https://proceedings.mlr.press/v178/sefidgaran22a.html
%V 178
%X Understanding generalization in modern machine learning settings has been one of the major challenges in statistical learning theory. In this context, recent years have witnessed the development of various generalization bounds suggesting different complexity notions such as the mutual information between the data sample and the algorithm output, compressibility of the hypothesis space, and the fractal dimension of the hypothesis space. While these bounds have illuminated the problem at hand from different angles, their suggested complexity notions might appear seemingly unrelated, thereby restricting their high-level impact. In this study, we prove novel generalization bounds through the lens of rate-distortion theory, and explicitly relate the concepts of mutual information, compressibility, and fractal dimensions in a single mathematical framework. Our approach consists of (i) defining a generalized notion of compressibility by using source coding concepts, and (ii) showing that the 'compression error rate' can be linked to the generalization error both in expectation and with high probability. We show that in the 'lossless compression' setting, we recover and improve existing mutual information-based bounds, whereas a 'lossy compression' scheme allows us to link generalization to the rate-distortion dimension, a particular notion of fractal dimension. Our results bring a more unified perspective on generalization and open up several future research directions.
APA
Sefidgaran, M., Gohari, A., Richard, G. & Simsekli, U. (2022). Rate-Distortion Theoretic Generalization Bounds for Stochastic Learning Algorithms. Proceedings of Thirty Fifth Conference on Learning Theory, in Proceedings of Machine Learning Research 178:4416-4463. Available from https://proceedings.mlr.press/v178/sefidgaran22a.html.