Shifting Regret, Mirror Descent, and Matrices

András György, Csaba Szepesvári;
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:2943-2951, 2016.

Abstract

We consider the problem of online prediction in changing environments. In this framework, the performance of a predictor is evaluated by its loss relative to an arbitrarily changing comparator whose individual components come from a base class of predictors. Typical results in the literature treat different base classes (experts, linear predictors on the simplex, etc.) separately. By introducing an arbitrary mapping inside the mirror descent algorithm, we provide a framework that unifies and extends existing results. As an example, we prove new shifting regret bounds for matrix prediction problems.
