PAC-Bayesian Bounds based on the Rényi Divergence


Luc Bégin, Pascal Germain, François Laviolette, Jean-Francis Roy ;
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:435-444, 2016.

Abstract

We propose a simplified proof process for PAC-Bayesian generalization bounds that divides the proof into four successive inequalities, easing the "customization" of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence. Finally, we present an empirical evaluation of the tightness of each inequality in the simplified proof, for both the classical PAC-Bayesian bounds and those based on the Rényi divergence.
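The Rényi divergence mentioned in the abstract generalizes the Kullback-Leibler divergence via an order parameter α, recovering KL in the limit α → 1. As a rough illustration (not code from the paper), the following sketch computes the Rényi divergence of order α between two discrete distributions; the function name and interface are hypothetical:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) for discrete distributions.

    D_alpha(P || Q) = (1 / (alpha - 1)) * log( sum_i p_i^alpha * q_i^(1 - alpha) )
    For alpha = 1 we return the KL divergence, its limiting case.
    Assumes p and q are strictly positive and sum to 1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if alpha == 1.0:
        # Limiting case: Kullback-Leibler divergence
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Example: divergence between a uniform and a skewed distribution
p = [0.5, 0.5]
q = [0.9, 0.1]
print(renyi_divergence(p, q, alpha=2.0))  # order-2 Rényi divergence
print(renyi_divergence(p, q, alpha=1.0))  # KL divergence (limit alpha -> 1)
```

The Rényi divergence is non-negative and non-decreasing in α, so bounds using higher orders trade a larger divergence term against other quantities in the bound.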
