Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering

Jennifer Tang
Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:3095-3112, 2022.

Abstract

This paper considers the problem of finding a tighter upper bound on the minimax regret of patterns, a class used to study large-alphabet distributions which avoids infinite asymptotic regret and redundancy. Our method for finding upper bounds for minimax regret uses cover numbers with Kullback-Leibler (KL) divergence as the distance. Compared to existing results by Acharya et al. (2013), we are able to improve the power of the exponent on the logarithmic term, giving a minimax regret bound which matches the best known minimax redundancy bound on patterns.
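The covering distance in the abstract is the Kullback-Leibler divergence, which for discrete distributions $p$ and $q$ is $D(p\,\|\,q) = \sum_i p_i \log(p_i/q_i)$. As a quick reference (not from the paper itself), a minimal sketch of this quantity in Python:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete
    distributions given as probability lists (natural logarithm).
    Terms with p_i = 0 contribute nothing by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: divergence of a biased coin from a fair coin.
p = [0.8, 0.2]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # about 0.1927 nats
```

A KL cover of a distribution class is a finite set of distributions such that every member of the class is within a given KL radius of some cover point; the paper bounds minimax regret via the size of such covers.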

Cite this Paper


BibTeX
@InProceedings{pmlr-v178-tang22a,
  title     = {Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering},
  author    = {Tang, Jennifer},
  booktitle = {Proceedings of Thirty Fifth Conference on Learning Theory},
  pages     = {3095--3112},
  year      = {2022},
  editor    = {Loh, Po-Ling and Raginsky, Maxim},
  volume    = {178},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--05 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v178/tang22a/tang22a.pdf},
  url       = {https://proceedings.mlr.press/v178/tang22a.html},
  abstract  = {This paper considers the problem of finding a tighter upper bound on the minimax regret of patterns, a class used to study large-alphabet distributions which avoids infinite asymptotic regret and redundancy. Our method for finding upper bounds for minimax regret uses cover numbers with Kullback-Leibler (KL) divergence as the distance. Compared to existing results by Acharya et al. (2013), we are able to improve the power of the exponent on the logarithmic term, giving a minimax regret bound which matches the best known minimax redundancy bound on patterns.}
}
Endnote
%0 Conference Paper
%T Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering
%A Jennifer Tang
%B Proceedings of Thirty Fifth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2022
%E Po-Ling Loh
%E Maxim Raginsky
%F pmlr-v178-tang22a
%I PMLR
%P 3095--3112
%U https://proceedings.mlr.press/v178/tang22a.html
%V 178
%X This paper considers the problem of finding a tighter upper bound on the minimax regret of patterns, a class used to study large-alphabet distributions which avoids infinite asymptotic regret and redundancy. Our method for finding upper bounds for minimax regret uses cover numbers with Kullback-Leibler (KL) divergence as the distance. Compared to existing results by Acharya et al. (2013), we are able to improve the power of the exponent on the logarithmic term, giving a minimax regret bound which matches the best known minimax redundancy bound on patterns.
APA
Tang, J. (2022). Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering. Proceedings of Thirty Fifth Conference on Learning Theory, in Proceedings of Machine Learning Research 178:3095-3112. Available from https://proceedings.mlr.press/v178/tang22a.html.