
Rate-Distortion Theoretic Generalization Bounds for Stochastic Learning Algorithms

Milad Sefidgaran (1), Amin Gohari (2), Gaël Richard (3, 4, 1), Umut Şimşekli (5, 3, 4, 1)

3. S2A - Signal, Statistique et Apprentissage, LTCI - Laboratoire Traitement et Communication de l'Information
5. SIERRA - Statistical Machine Learning and Parsimony, DI-ENS - Département d'informatique - ENS Paris, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: Understanding generalization in modern machine learning settings has been one of the major challenges in statistical learning theory. In this context, recent years have witnessed the development of various generalization bounds suggesting different complexity notions, such as the mutual information between the data sample and the algorithm output, the compressibility of the hypothesis space, and the fractal dimension of the hypothesis space. While these bounds have illuminated the problem at hand from different angles, their suggested complexity notions may appear unrelated, which restricts their high-level impact. In this study, we prove novel generalization bounds through the lens of rate-distortion theory, and explicitly relate the concepts of mutual information, compressibility, and fractal dimension in a single mathematical framework. Our approach consists of (i) defining a generalized notion of compressibility using source coding concepts, and (ii) showing that the 'compression error rate' can be linked to the generalization error both in expectation and with high probability. We show that in the 'lossless compression' setting we recover and improve existing mutual information-based bounds, whereas a 'lossy compression' scheme allows us to link generalization to the rate-distortion dimension, a particular notion of fractal dimension. Our results bring a more unified perspective on generalization and open up several future research directions.
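
For context on the quantities the abstract names, the mutual information-based bounds it refers to take the following classical form; the first display below is the well-known Xu-Raginsky bound, which the abstract says the 'lossless compression' setting recovers and improves, and the second is the standard definition of the rate-distortion dimension invoked in the 'lossy' regime. This is a sketch for orientation, not a statement of the paper's own results, and the symbols (sample S, output hypothesis W, sample size n, sub-Gaussian constant sigma) are illustrative notation rather than the paper's.

% Classical mutual-information generalization bound (Xu and Raginsky, 2017);
% assumes the loss \ell(w, Z) is \sigma-sub-Gaussian under the data distribution.
% S = (Z_1, \dots, Z_n) is the training sample, W the algorithm's output.
\[
  \mathbb{E}\bigl[\operatorname{gen}(S, W)\bigr]
  \le \sqrt{\frac{2\sigma^{2}}{n}\, I(S; W)},
\]
% where gen(S, W) is the population risk minus the empirical risk and
% I(S; W) is the mutual information between the sample and the output.

% Rate-distortion dimension (Kawabata and Dembo, 1994): the scaling of the
% rate-distortion function R(D) as the allowed distortion D vanishes.
\[
  \dim_{R} = \lim_{D \to 0} \frac{R(D)}{\log(1/D)}.
\]
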
Document type: Conference papers

https://hal.telecom-paris.fr/hal-03759597
Contributor: Gaël Richard
Submitted on: Wednesday, August 24, 2022 - 11:41:57 AM
Last modification on: Tuesday, August 30, 2022 - 3:41:08 AM
Long-term archiving on: Friday, November 25, 2022 - 7:45:15 PM

File

sefidgaran22a.pdf
Publisher files allowed on an open archive

Identifiers

  • HAL Id: hal-03759597, version 1

Citation

Milad Sefidgaran, Amin Gohari, Gael Richard, Umut Şimşekli. Rate-Distortion Theoretic Generalization Bounds for Stochastic Learning Algorithms. COLT 2022 - 35th Annual Conference on Learning Theory, Jul 2022, London, United Kingdom. ⟨hal-03759597⟩

Metrics

Record views: 47
File downloads: 13