Entropy-Based Concentration Inequalities for Dependent Variables

Liva Ralaivola, Massih-Reza Amini
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:2436-2444, 2015.

Abstract

We provide new concentration inequalities for functions of dependent variables. The work extends that of Janson (2004), which proposes concentration inequalities combining the Laplace transform with the idea of fractional graph coloring, as well as many works that derive concentration inequalities using the entropy method (see, e.g., Boucheron et al., 2003). We give inequalities for fractionally sub-additive and fractionally self-bounding functions. Along the way, we prove a new Talagrand concentration inequality for fractionally sub-additive functions of dependent variables. The results allow us to envision the derivation of generalization bounds for various applications where dependent variables naturally appear, such as in bipartite ranking.
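For context, the Janson (2004) result the abstract builds on gives a Hoeffding-type tail bound for a sum of bounded dependent variables, where the independent-case exponent is weakened by the fractional chromatic number χ* of the dependency graph. A minimal sketch of evaluating that bound (function name and example values are illustrative, not from the paper):

```python
import math

def janson_hoeffding_bound(t, ranges, chi_star):
    """Hoeffding-type bound of Janson (2004) for S = sum of dependent
    variables X_i with a_i <= X_i <= b_i and dependency graph with
    fractional chromatic number chi_star:
        P(S - E[S] >= t) <= exp(-2 t^2 / (chi_star * sum_i (b_i - a_i)^2)).
    With chi_star = 1 (independent variables) this recovers the
    classical Hoeffding inequality."""
    span = sum((b - a) ** 2 for a, b in ranges)
    return math.exp(-2.0 * t ** 2 / (chi_star * span))

# 100 variables in [0, 1], deviation t = 10.
ranges = [(0.0, 1.0)] * 100
print(janson_hoeffding_bound(10.0, ranges, 1.0))  # independent case: exp(-2)
print(janson_hoeffding_bound(10.0, ranges, 2.0))  # chi* = 2: exp(-1), a weaker bound
```

Note how a larger χ* (more dependence) loosens the bound; the paper's entropy-method results pursue the same fractional-cover idea for richer function classes than plain sums.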

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-ralaivola15,
  title     = {Entropy-Based Concentration Inequalities for Dependent Variables},
  author    = {Liva Ralaivola and Massih-Reza Amini},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {2436--2444},
  year      = {2015},
  editor    = {Francis Bach and David Blei},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/ralaivola15.pdf},
  url       = {http://proceedings.mlr.press/v37/ralaivola15.html},
  abstract  = {We provide new concentration inequalities for functions of dependent variables. The work extends that of Janson (2004), which proposes concentration inequalities combining the Laplace transform with the idea of fractional graph coloring, as well as many works that derive concentration inequalities using the entropy method (see, e.g., Boucheron et al., 2003). We give inequalities for fractionally sub-additive and fractionally self-bounding functions. Along the way, we prove a new Talagrand concentration inequality for fractionally sub-additive functions of dependent variables. The results allow us to envision the derivation of generalization bounds for various applications where dependent variables naturally appear, such as in bipartite ranking.}
}
Endnote
%0 Conference Paper
%T Entropy-Based Concentration Inequalities for Dependent Variables
%A Liva Ralaivola
%A Massih-Reza Amini
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-ralaivola15
%I PMLR
%J Proceedings of Machine Learning Research
%P 2436--2444
%U http://proceedings.mlr.press
%V 37
%W PMLR
%X We provide new concentration inequalities for functions of dependent variables. The work extends that of Janson (2004), which proposes concentration inequalities combining the Laplace transform with the idea of fractional graph coloring, as well as many works that derive concentration inequalities using the entropy method (see, e.g., Boucheron et al., 2003). We give inequalities for fractionally sub-additive and fractionally self-bounding functions. Along the way, we prove a new Talagrand concentration inequality for fractionally sub-additive functions of dependent variables. The results allow us to envision the derivation of generalization bounds for various applications where dependent variables naturally appear, such as in bipartite ranking.
RIS
TY  - CPAPER
TI  - Entropy-Based Concentration Inequalities for Dependent Variables
AU  - Liva Ralaivola
AU  - Massih-Reza Amini
BT  - Proceedings of the 32nd International Conference on Machine Learning
PY  - 2015/06/01
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-ralaivola15
PB  - PMLR
SP  - 2436
DP  - PMLR
EP  - 2444
L1  - http://proceedings.mlr.press/v37/ralaivola15.pdf
UR  - http://proceedings.mlr.press/v37/ralaivola15.html
AB  - We provide new concentration inequalities for functions of dependent variables. The work extends that of Janson (2004), which proposes concentration inequalities combining the Laplace transform with the idea of fractional graph coloring, as well as many works that derive concentration inequalities using the entropy method (see, e.g., Boucheron et al., 2003). We give inequalities for fractionally sub-additive and fractionally self-bounding functions. Along the way, we prove a new Talagrand concentration inequality for fractionally sub-additive functions of dependent variables. The results allow us to envision the derivation of generalization bounds for various applications where dependent variables naturally appear, such as in bipartite ranking.
ER  -
APA
Ralaivola, L. & Amini, M. (2015). Entropy-Based Concentration Inequalities for Dependent Variables. Proceedings of the 32nd International Conference on Machine Learning, in PMLR 37:2436-2444.