Leveraging PAC-Bayes Theory and Gibbs Distributions for Generalization Bounds with Complexity Measures

Paul Viallard, Rémi Emonet, Amaury Habrard, Emilie Morvant, Valentina Zantedeschi
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:3007-3015, 2024.

Abstract

In statistical learning theory, a generalization bound usually involves a complexity measure imposed by the considered theoretical framework. This limits the scope of such bounds, as algorithms in practice rely on other forms of capacity measures or regularizers. In this paper, we leverage the framework of disintegrated PAC-Bayes bounds to derive a general generalization bound that can be instantiated with arbitrary complexity measures. One key step in proving such a result is to consider a commonly used family of distributions: the Gibbs distributions. Our bound holds in probability jointly over the hypothesis and the learning sample, which allows the complexity measure to be adapted to the generalization gap, as it can be customized to fit both the hypothesis class and the task.
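
To make the role of the Gibbs distributions concrete, the sketch below writes out the usual Gibbs posterior form; the notation (a prior \pi on the hypothesis set \mathcal{H}, a complexity measure \mu, a temperature \lambda, and the resulting posterior \rho_S) follows common PAC-Bayes conventions and is assumed here for illustration rather than taken from the paper itself.

% A minimal sketch, assuming the Gibbs distribution reweights a prior \pi by
% an arbitrary complexity measure \mu (hypothetical notation; standard
% PAC-Bayes conventions, not the paper's own definitions):
\[
  \rho_{S}(h)
  \;=\;
  \frac{\pi(h)\,\exp\!\bigl(-\lambda\,\mu(S,h)\bigr)}
       {\int_{\mathcal{H}} \pi(h')\,\exp\!\bigl(-\lambda\,\mu(S,h')\bigr)\,\mathrm{d}h'},
  \qquad \lambda > 0.
\]
% Since \mu is arbitrary, it can encode an empirical risk, a norm-based
% regularizer, or any other capacity measure used in practice; a disintegrated
% bound then concerns a single hypothesis h drawn from \rho_S, in probability
% jointly over h and the learning sample S.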

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-viallard24a,
  title     = {Leveraging {PAC}-{B}ayes Theory and {G}ibbs Distributions for Generalization Bounds with Complexity Measures},
  author    = {Viallard, Paul and Emonet, R\'{e}mi and Habrard, Amaury and Morvant, Emilie and Zantedeschi, Valentina},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {3007--3015},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/viallard24a/viallard24a.pdf},
  url       = {https://proceedings.mlr.press/v238/viallard24a.html}
}
EndNote
%0 Conference Paper
%T Leveraging PAC-Bayes Theory and Gibbs Distributions for Generalization Bounds with Complexity Measures
%A Paul Viallard
%A Rémi Emonet
%A Amaury Habrard
%A Emilie Morvant
%A Valentina Zantedeschi
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-viallard24a
%I PMLR
%P 3007--3015
%U https://proceedings.mlr.press/v238/viallard24a.html
%V 238
APA
Viallard, P., Emonet, R., Habrard, A., Morvant, E. & Zantedeschi, V. (2024). Leveraging PAC-Bayes Theory and Gibbs Distributions for Generalization Bounds with Complexity Measures. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:3007-3015. Available from https://proceedings.mlr.press/v238/viallard24a.html.