Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering
Proceedings of the Thirty-Fifth Conference on Learning Theory, PMLR 178:3095-3112, 2022.
Abstract
This paper considers the problem of finding a tighter upper bound on the minimax regret of patterns, a class used to study large-alphabet distributions while avoiding infinite asymptotic regret and redundancy. Our method for upper-bounding the minimax regret uses covering numbers with the Kullback-Leibler (KL) divergence as the distance. Compared with the existing results of Acharya et al. (2013), we improve the power of the exponent on the logarithmic term, giving a minimax regret bound that matches the best known minimax redundancy bound for patterns.
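As a rough illustration of the covering technique, with notation that is ours rather than the paper's, consider the standard argument bounding redundancy via a KL cover. Let $\mathcal{P}$ be a class of distributions on length-$n$ sequences, and let $p_1, \dots, p_N$ be the centers of a KL cover of radius $\epsilon$, so that every $p \in \mathcal{P}$ has some center $p_i$ with $D(p \,\|\, p_i) \le \epsilon$. Coding with the uniform mixture of the centers,
\[
q(x^n) \;=\; \frac{1}{N} \sum_{i=1}^{N} p_i(x^n) \;\ge\; \frac{p_i(x^n)}{N},
\]
gives, for any $p \in \mathcal{P}$ with nearest center $p_i$,
\[
D(p \,\|\, q) \;\le\; D(p \,\|\, p_i) + \log N \;\le\; \epsilon + \log N,
\]
so the redundancy is at most $\inf_{\epsilon > 0} \big( \log N_{\mathrm{KL}}(\epsilon) + \epsilon \big)$, where $N_{\mathrm{KL}}(\epsilon)$ is the KL covering number of $\mathcal{P}$. The paper develops covering bounds of this flavor for the worst-case (minimax) regret of patterns, where the controlled quantity is the pointwise log-loss gap rather than its expectation.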