Entropy-Based Concentration Inequalities for Dependent Variables

Liva Ralaivola, Massih-Reza Amini ;
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:2436-2444, 2015.

Abstract

We provide new concentration inequalities for functions of dependent variables. The work extends that of Janson (2004), which proposes concentration inequalities based on a combination of the Laplace transform and the idea of fractional graph coloring, as well as the many works that derive concentration inequalities via the entropy method (see, e.g., Boucheron et al., 2003). We give inequalities for fractionally sub-additive and fractionally self-bounding functions. Along the way, we prove a new Talagrand concentration inequality for fractionally sub-additive functions of dependent variables. The results pave the way for deriving generalization bounds in settings where dependent variables naturally appear, such as bipartite ranking.
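The fractional graph coloring idea underlying Janson's approach can be illustrated with a minimal sketch (the helper `greedy_coloring` below is hypothetical, not from the paper): if the dependency graph of the variables is properly colored, the variables within one color class are mutually independent, so classical independent-variable bounds apply per class, at a cost that grows with the number of colors (an upper bound on the fractional chromatic number).

```python
def greedy_coloring(n, edges):
    """Greedy proper coloring of a dependency graph on vertices 0..n-1.

    Each vertex gets the smallest color not used by an already-colored
    neighbor. Vertices sharing a color form an independent set, i.e. a
    group of mutually independent variables in the dependency graph.
    """
    adj = {i: set() for i in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    color = {}
    for v in range(n):
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# Example: four variables on a path 0-1-2-3, each dependent only on
# its neighbors. Two color classes suffice: {0, 2} and {1, 3}.
coloring = greedy_coloring(4, [(0, 1), (1, 2), (2, 3)])
```

The number of colors produced this way only upper-bounds the fractional chromatic number; tighter fractional covers (weighted collections of independent sets) generally yield sharper constants in the resulting inequalities.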
