Learning Theory for Conditional Risk Minimization


Alexander Zimin, Christoph Lampert;
Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, PMLR 54:213-222, 2017.

Abstract

In this work we study the learnability of stochastic processes with respect to the conditional risk, i.e. the existence of a learning algorithm whose next-step performance improves as the amount of observed data grows. We introduce a notion of pairwise discrepancy between conditional distributions at different time steps and show how certain properties of these discrepancies can be used to construct a successful learning algorithm. Our main results are two theorems that establish criteria for learnability for many classes of stochastic processes, including all special cases studied previously in the literature.
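To make the notion of a pairwise discrepancy concrete, the following is a minimal illustrative sketch: for finite outcome spaces, one natural discrepancy between two conditional distributions is the largest difference in expected loss over a fixed hypothesis set. The function name `pairwise_discrepancy` and this particular definition are assumptions for illustration; the paper's exact construction may differ.

```python
import numpy as np

def pairwise_discrepancy(p, q, losses):
    """Illustrative discrepancy between two conditional distributions
    over a finite outcome space: the worst-case difference in expected
    loss across a finite hypothesis set (hypothetical definition, not
    necessarily the paper's).

    p, q:    probability vectors over outcomes.
    losses:  array of shape (num_hypotheses, num_outcomes);
             losses[h, x] is the loss of hypothesis h on outcome x.
    """
    # Expected loss (conditional risk) of each hypothesis under p and q.
    risk_p = losses @ p
    risk_q = losses @ q
    # Worst-case gap in conditional risk over the hypothesis set.
    return float(np.max(np.abs(risk_p - risk_q)))

# Two identical conditional distributions have zero discrepancy.
p = np.array([0.5, 0.5])
losses = np.array([[0.0, 1.0],
                   [1.0, 0.0]])
print(pairwise_discrepancy(p, p, losses))  # 0.0
```

Under this definition, small pairwise discrepancies between time steps mean that risk estimates gathered at earlier steps remain informative about the next step, which is the kind of property the paper exploits to build a learning algorithm.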
