# Tighter PAC-Bayes Generalisation Bounds by Leveraging Example Difficulty

*Proceedings of The 26th International Conference on Artificial Intelligence and Statistics*, PMLR 206:8165-8182, 2023.

#### Abstract

We introduce a modified version of the excess risk, which can be used to obtain empirically tighter, faster-rate PAC-Bayesian generalisation bounds. This modified excess risk leverages information about the relative hardness of data examples to reduce the variance of its empirical counterpart, tightening the bound. We combine this with a new bound for $[-1, 1]$-valued (and potentially non-independent) signed losses, which is more favourable when the losses empirically have low variance around zero. The primary new technical tool is a novel result for sequences of interdependent random vectors, which may be of independent interest. We empirically evaluate these new bounds on a number of real-world datasets.
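The variance-reduction mechanism described in the abstract can be sketched numerically. This is an illustrative toy example, not the paper's exact construction: the `difficulty` scores and the correlation model for `loss` are hypothetical stand-ins for whatever hardness information is available (e.g. the loss of a reference predictor on each example).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical per-example difficulty scores in [0, 1].
difficulty = rng.beta(2.0, 2.0, size=n)

# Hypothetical [0, 1]-valued losses of the learned predictor, correlated
# with difficulty: hard examples tend to incur high loss for any predictor.
loss = np.clip(difficulty + 0.1 * rng.normal(size=n), 0.0, 1.0)

# Subtracting the difficulty baseline gives a signed loss in [-1, 1].
# Because loss and difficulty are positively correlated, the empirical
# mean of the centred loss has lower variance than that of the raw loss,
# which is the effect the abstract exploits to tighten the bound.
signed_loss = loss - difficulty

print(f"raw loss variance:     {loss.var():.4f}")
print(f"centred loss variance: {signed_loss.var():.4f}")
```

Under this toy model the centred loss has markedly lower variance than the raw loss, so a bound whose looseness scales with the empirical variance of the (signed) loss benefits directly.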