Rate-Distortion Theoretic Generalization Bounds for Stochastic Learning Algorithms
Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:4416-4463, 2022.
Abstract
Understanding generalization in modern machine learning settings has been one of the major challenges in statistical learning theory. In this context, recent years have witnessed the development of various generalization bounds suggesting different complexity notions, such as the mutual information between the data sample and the algorithm output, compressibility of the hypothesis space, and the fractal dimension of the hypothesis space. While these bounds have illuminated the problem at hand from different angles, the complexity notions they suggest may appear unrelated to one another, which restricts their high-level impact. In this study, we prove novel generalization bounds through the lens of rate-distortion theory, and explicitly relate the concepts of mutual information, compressibility, and fractal dimensions in a single mathematical framework. Our approach consists of (i) defining a generalized notion of compressibility using source coding concepts, and (ii) showing that the 'compression error rate' can be linked to the generalization error both in expectation and with high probability. We show that in the 'lossless compression' setting we recover and improve existing mutual information-based bounds, whereas a 'lossy compression' scheme allows us to link generalization to the rate-distortion dimension, a particular notion of fractal dimension. Our results bring a more unified perspective on generalization and open up several future research directions.
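For context (this sketch is not part of the paper's abstract), the mutual information-based bounds referred to in the lossless-compression setting are of the classical Xu-Raginsky type: writing S = (Z_1, ..., Z_n) for the training sample drawn i.i.d. from a distribution mu, W for the output of the stochastic learning algorithm, and assuming the loss ell(w, Z) is sigma-subgaussian for every hypothesis w, the expected generalization error is controlled by the mutual information I(S; W):

\[
\mathrm{gen}(S, W) \;=\; \mathbb{E}_{Z \sim \mu}\bigl[\ell(W, Z)\bigr] \;-\; \frac{1}{n}\sum_{i=1}^{n} \ell(W, Z_i),
\qquad
\bigl|\,\mathbb{E}\bigl[\mathrm{gen}(S, W)\bigr]\,\bigr| \;\le\; \sqrt{\frac{2\sigma^{2}\, I(S; W)}{n}} .
\]

Roughly speaking, the paper's lossy-compression results replace the mutual information term with a rate-distortion quantity, whose small-distortion behaviour is what the rate-distortion dimension mentioned in the abstract captures.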