Sequential prediction with coded side information under logarithmic loss

Yanina Shkel, Maxim Raginsky, Sergio Verdú;
Proceedings of Algorithmic Learning Theory, PMLR 83:753-769, 2018.

Abstract

We study the problem of sequential prediction with coded side information under logarithmic loss (log-loss). We show an operational equivalence between this setup and lossy compression with log-loss distortion. Using this insight, together with recent work on lossy compression with log-loss, we connect prediction strategies with distributions in a certain subset of the probability simplex. This allows us to derive a Shtarkov-like bound for regret and to evaluate the regret for several illustrative classes of experts. In the present work, we mainly focus on the “batch” side information setting with sequential prediction.
