Vector-valued self-normalized concentration inequalities beyond sub-Gaussianity

Diego Martinez-Taboada, Tomás González, Aaditya Ramdas
Proceedings of The 37th International Conference on Algorithmic Learning Theory, PMLR 313:1-31, 2026.

Abstract

The study of self-normalized processes plays a crucial role in a wide range of applications, from sequential decision-making to econometrics. While the behavior of self-normalized concentration has been widely investigated for scalar-valued processes, vector-valued processes remain comparatively underexplored, especially outside of the sub-Gaussian framework. In this contribution, we provide concentration inequalities for self-normalized processes with light tails beyond sub-Gaussianity, including Bernstein-type, Bennett-type, and empirical Bennett-type inequalities. We illustrate the relevance of our results in the context of online linear regression, with applications in (kernelized) linear bandits.
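For orientation (this display is not taken from the paper itself), the classical sub-Gaussian self-normalized bound that results of this kind go beyond is the one of Abbasi-Yadkori et al. (2011): for predictable vectors x_s and conditionally R-sub-Gaussian noise ε_s, with regularization λ > 0,

\[
\mathbb{P}\left( \exists t \ge 1 : \Big\| \sum_{s=1}^{t} \varepsilon_s x_s \Big\|_{V_t^{-1}} > R \sqrt{ 2 \log\!\left( \frac{\det(V_t)^{1/2} \det(\lambda I)^{-1/2}}{\delta} \right) } \right) \le \delta,
\qquad V_t := \lambda I + \sum_{s=1}^{t} x_s x_s^{\top}.
\]

Bernstein- and Bennett-type versions, as developed in this paper, replace the single sub-Gaussian constant R with variance- and range-dependent quantities, which is what allows lighter-tailed (but not sub-Gaussian) noise; see the paper for the precise statements.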

Cite this Paper


BibTeX
@InProceedings{pmlr-v313-martinez-taboada26a,
  title     = {Vector-valued self-normalized concentration inequalities beyond sub-Gaussianity},
  author    = {Martinez-Taboada, Diego and Gonz{\'a}lez, Tom{\'a}s and Ramdas, Aaditya},
  booktitle = {Proceedings of The 37th International Conference on Algorithmic Learning Theory},
  pages     = {1--31},
  year      = {2026},
  editor    = {Telgarsky, Matus and Ullman, Jonathan},
  volume    = {313},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--26 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v313/main/assets/martinez-taboada26a/martinez-taboada26a.pdf},
  url       = {https://proceedings.mlr.press/v313/martinez-taboada26a.html},
  abstract  = {The study of self-normalized processes plays a crucial role in a wide range of applications, from sequential decision-making to econometrics. While the behavior of self-normalized concentration has been widely investigated for scalar-valued processes, vector-valued processes remain comparatively underexplored, especially outside of the sub-Gaussian framework. In this contribution, we provide concentration inequalities for self-normalized processes with light tails beyond sub-Gaussianity, including Bernstein-type, Bennett-type, and empirical Bennett-type inequalities. We illustrate the relevance of our results in the context of online linear regression, with applications in (kernelized) linear bandits.}
}
Endnote
%0 Conference Paper
%T Vector-valued self-normalized concentration inequalities beyond sub-Gaussianity
%A Diego Martinez-Taboada
%A Tomás González
%A Aaditya Ramdas
%B Proceedings of The 37th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2026
%E Matus Telgarsky
%E Jonathan Ullman
%F pmlr-v313-martinez-taboada26a
%I PMLR
%P 1--31
%U https://proceedings.mlr.press/v313/martinez-taboada26a.html
%V 313
%X The study of self-normalized processes plays a crucial role in a wide range of applications, from sequential decision-making to econometrics. While the behavior of self-normalized concentration has been widely investigated for scalar-valued processes, vector-valued processes remain comparatively underexplored, especially outside of the sub-Gaussian framework. In this contribution, we provide concentration inequalities for self-normalized processes with light tails beyond sub-Gaussianity, including Bernstein-type, Bennett-type, and empirical Bennett-type inequalities. We illustrate the relevance of our results in the context of online linear regression, with applications in (kernelized) linear bandits.
APA
Martinez-Taboada, D., González, T., & Ramdas, A. (2026). Vector-valued self-normalized concentration inequalities beyond sub-Gaussianity. Proceedings of The 37th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 313:1-31. Available from https://proceedings.mlr.press/v313/martinez-taboada26a.html.