PAC-Bayesian Bounds based on the Rényi Divergence

Luc Bégin, Pascal Germain, François Laviolette, Jean-Francis Roy
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:435-444, 2016.

Abstract

We propose a simplified proof process for PAC-Bayesian generalization bounds that divides the proof into four successive inequalities, easing the "customization" of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence. Finally, we present an empirical evaluation of the tightness of each inequality in the simplified proof, for both the classical PAC-Bayesian bounds and those based on the Rényi divergence.
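For reference, the Rényi divergence of order α appearing in the title is the standard quantity recalled below; the notation (posterior ρ, prior π, order α) is ours and is not reproduced from this page, so the paper itself should be consulted for the exact form used in its theorems. For α > 0, α ≠ 1,

\[
D_\alpha(\rho \,\|\, \pi) \;=\; \frac{1}{\alpha - 1}\, \ln \mathop{\mathbb{E}}_{h \sim \pi}\!\left[ \left( \frac{\rho(h)}{\pi(h)} \right)^{\!\alpha} \right],
\]

and the Kullback-Leibler divergence is recovered in the limit:

\[
\lim_{\alpha \to 1} D_\alpha(\rho \,\|\, \pi) \;=\; \mathrm{KL}(\rho \,\|\, \pi) \;=\; \mathop{\mathbb{E}}_{h \sim \rho} \ln \frac{\rho(h)}{\pi(h)}.
\]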

Cite this Paper

BibTeX
@InProceedings{pmlr-v51-begin16,
  title     = {PAC-Bayesian Bounds based on the Rényi Divergence},
  author    = {Bégin, Luc and Germain, Pascal and Laviolette, François and Roy, Jean-Francis},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {435--444},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/begin16.pdf},
  url       = {https://proceedings.mlr.press/v51/begin16.html},
  abstract  = {We propose a simplified proof process for PAC-Bayesian generalization bounds, that allows to divide the proof in four successive inequalities, easing the "customization" of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence. Finally, we present an empirical evaluation of the tightness of each inequality of the simplified proof, for both the classical PAC-Bayesian bounds and those based on the Rényi divergence.}
}
Endnote
%0 Conference Paper
%T PAC-Bayesian Bounds based on the Rényi Divergence
%A Luc Bégin
%A Pascal Germain
%A François Laviolette
%A Jean-Francis Roy
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-begin16
%I PMLR
%P 435--444
%U https://proceedings.mlr.press/v51/begin16.html
%V 51
%X We propose a simplified proof process for PAC-Bayesian generalization bounds, that allows to divide the proof in four successive inequalities, easing the "customization" of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence. Finally, we present an empirical evaluation of the tightness of each inequality of the simplified proof, for both the classical PAC-Bayesian bounds and those based on the Rényi divergence.
RIS
TY - CPAPER
TI - PAC-Bayesian Bounds based on the Rényi Divergence
AU - Luc Bégin
AU - Pascal Germain
AU - François Laviolette
AU - Jean-Francis Roy
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-begin16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 435
EP - 444
L1 - http://proceedings.mlr.press/v51/begin16.pdf
UR - https://proceedings.mlr.press/v51/begin16.html
AB - We propose a simplified proof process for PAC-Bayesian generalization bounds, that allows to divide the proof in four successive inequalities, easing the "customization" of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence. Finally, we present an empirical evaluation of the tightness of each inequality of the simplified proof, for both the classical PAC-Bayesian bounds and those based on the Rényi divergence.
ER -
APA
Bégin, L., Germain, P., Laviolette, F. & Roy, J. (2016). PAC-Bayesian Bounds based on the Rényi Divergence. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:435-444. Available from https://proceedings.mlr.press/v51/begin16.html.
