Concentration Inequalities for General Functions of Heavy-Tailed Random Variables

Shaojie Li, Yong Liu
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:28343-28367, 2024.

Abstract

Concentration inequalities play an essential role in the study of machine learning and high-dimensional statistics. In this paper, we obtain unbounded analogues of the popular bounded difference inequality for functions of independent random variables with heavy-tailed distributions. The main results provide a general framework applicable to all heavy-tailed distributions with finite variance. To illustrate the strength of our results, we present applications to sub-exponential tails, sub-Weibull tails, and heavier, polynomially decaying tails. Applied to some standard problems in statistical learning theory (vector-valued concentration, Rademacher complexity, and algorithmic stability), we show that these inequalities allow existing results to be extended to heavy-tailed distributions, requiring only finite variance.
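For reference, the two key objects in the abstract can be made concrete. The classical bounded difference inequality (McDiarmid's inequality), whose unbounded analogue the paper develops, is standardly stated as follows (this is the textbook statement, not quoted from the paper): for independent random variables X_1, ..., X_n and a function f satisfying the coordinate-wise bounded difference condition with constants c_1, ..., c_n,

\[
\mathbb{P}\bigl( f(X_1,\ldots,X_n) - \mathbb{E}\, f(X_1,\ldots,X_n) \ge t \bigr)
\le \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} c_i^2} \right),
\]
provided
\[
\sup_{x_1,\ldots,x_n,\, x_i'} \bigl| f(x_1,\ldots,x_i,\ldots,x_n) - f(x_1,\ldots,x_i',\ldots,x_n) \bigr| \le c_i,
\qquad i = 1, \ldots, n.
\]

This bounded difference condition fails when the coordinates are unbounded, which is precisely the gap for heavy-tailed inputs. Likewise, the sub-Weibull tails mentioned in the abstract are commonly defined (again, a standard definition rather than the paper's exact formulation) by
\[
\mathbb{P}\bigl( |X| \ge t \bigr) \le 2 \exp\bigl( - (t/K)^{\theta} \bigr), \qquad t \ge 0,
\]
for some scale K > 0 and tail parameter \theta > 0; here \theta = 2 recovers sub-Gaussian tails, \theta = 1 sub-exponential tails, and \theta < 1 gives tails heavier than exponential.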

Cite this Paper

BibTeX
@InProceedings{pmlr-v235-li24au,
  title     = {Concentration Inequalities for General Functions of Heavy-Tailed Random Variables},
  author    = {Li, Shaojie and Liu, Yong},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {28343--28367},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/li24au/li24au.pdf},
  url       = {https://proceedings.mlr.press/v235/li24au.html},
  abstract  = {Concentration inequalities play an essential role in the study of machine learning and high dimensional statistics. In this paper, we obtain unbounded analogues of the popular bounded difference inequality for functions of independent random variables with heavy-tailed distributions. The main results provide a general framework applicable to all heavy-tailed distributions with finite variance. To illustrate the strength of our results, we present applications to sub-exponential tails, sub-Weibull tails, and heavier polynomially decaying tails. Applied to some standard problems in statistical learning theory (vector valued concentration, Rademacher complexity, and algorithmic stability), we show that these inequalities allow an extension of existing results to heavy-tailed distributions up to finite variance.}
}
Endnote
%0 Conference Paper
%T Concentration Inequalities for General Functions of Heavy-Tailed Random Variables
%A Shaojie Li
%A Yong Liu
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-li24au
%I PMLR
%P 28343--28367
%U https://proceedings.mlr.press/v235/li24au.html
%V 235
%X Concentration inequalities play an essential role in the study of machine learning and high dimensional statistics. In this paper, we obtain unbounded analogues of the popular bounded difference inequality for functions of independent random variables with heavy-tailed distributions. The main results provide a general framework applicable to all heavy-tailed distributions with finite variance. To illustrate the strength of our results, we present applications to sub-exponential tails, sub-Weibull tails, and heavier polynomially decaying tails. Applied to some standard problems in statistical learning theory (vector valued concentration, Rademacher complexity, and algorithmic stability), we show that these inequalities allow an extension of existing results to heavy-tailed distributions up to finite variance.
APA
Li, S. & Liu, Y. (2024). Concentration Inequalities for General Functions of Heavy-Tailed Random Variables. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:28343-28367. Available from https://proceedings.mlr.press/v235/li24au.html.
