Plug-in Estimators for Conditional Expectations and Probabilities
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1513-1521, 2018.
Abstract
We study plug-in estimators of conditional expectations and probabilities, and we provide a systematic analysis of their rates of convergence. The plug-in approach is particularly useful in this setting since it introduces a natural link to VC and empirical process theory. We make use of this link to derive rates of convergence that hold uniformly over large classes of functions and sets, and under various conditions. For instance, we demonstrate that elementary conditional probabilities are estimated by these plug-in estimators with a rate of $n^{\alpha - 1/2}$ if one conditions with a VC-class of sets, where $\alpha \in [0, 1/2)$ controls a lower bound on the size of the sets we can estimate given $n$ samples. We obtain similar results for Kolmogorov's conditional expectation and probability, which generalize the elementary forms of conditioning. Due to their simplicity, plug-in estimators can be evaluated in linear time and there is no upfront cost for inference.
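The elementary plug-in estimator discussed in the abstract replaces the underlying distribution with the empirical measure, so $\widehat{P}(A \mid B) = P_n(A \cap B) / P_n(B)$. A minimal sketch of this idea (the function name and the Gaussian example are illustrative, not from the paper):

```python
import numpy as np

def plugin_conditional_probability(samples, in_A, in_B):
    """Empirical plug-in estimate of P(A | B) = P_n(A ∩ B) / P_n(B).

    samples: 1-D array of i.i.d. draws from the underlying distribution.
    in_A, in_B: vectorized boolean indicator functions for the events A and B.
    """
    a = in_A(samples)
    b = in_B(samples)
    n_b = b.sum()  # number of samples in the conditioning set B
    if n_b == 0:
        raise ValueError("no samples fall in the conditioning set B")
    # Counting over the sample is a single pass, hence linear-time evaluation.
    return (a & b).sum() / n_b

# Illustrative example: X ~ N(0, 1); estimate P(X > 1 | X > 0) ≈ 0.317.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
est = plugin_conditional_probability(x, lambda s: s > 1.0, lambda s: s > 0.0)
```

Note that the estimate degrades as $P(B)$ shrinks, which is what the parameter $\alpha$ above captures: it lower-bounds the size of the conditioning sets for which the uniform rate holds.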