Complexity-Based Approach to Calibration with Checking Rules
Proceedings of the 24th Annual Conference on Learning Theory, PMLR 19:293-314, 2011.
Abstract
We consider the problem of forecasting a sequence of outcomes from an unknown source. The quality of the forecaster is measured by a family of checking rules. We prove upper bounds on the value of the associated game, thus certifying the existence of a calibrated strategy for the forecaster. We show that the complexity of the family of checking rules can be captured by the notion of a sequential cover introduced in (Rakhlin et al., 2010a). Various natural assumptions on the class of checking rules are considered, including finiteness of the Vapnik-Chervonenkis and Littlestone dimensions.
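For concreteness, here is a minimal sketch of the kind of calibration criterion the abstract refers to, written in the common binary-outcome formulation; the notation below is illustrative and not taken from the paper itself. At each round the forecaster announces a probability p_t, the source reveals an outcome y_t, and each checking rule c in a family C decides, based on the history and the current forecast, whether the round is "checked":

\[
\sup_{c \in \mathcal{C}} \;\left| \frac{1}{n} \sum_{t=1}^{n} c\bigl(p_1, y_1, \ldots, p_{t-1}, y_{t-1}, p_t\bigr)\,\bigl(y_t - p_t\bigr) \right| \;\longrightarrow\; 0 \quad \text{as } n \to \infty,
\]

where p_t ∈ [0,1], y_t ∈ {0,1}, and c takes values in {0,1}. Under this reading, the "value of the associated game" is the best bound on this supremum that the forecaster can guarantee against an arbitrary source, and the sequential cover of C is what controls how fast that bound decays.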