Plug-in Estimators for Conditional Expectations and Probabilities
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1513-1521, 2018.
Abstract
We study plug-in estimators of conditional expectations and probabilities, and we provide a systematic analysis of their rates of convergence. The plug-in approach is particularly useful in this setting since it introduces a natural link to VC theory and empirical process theory. We make use of this link to derive rates of convergence that hold uniformly over large classes of functions and sets, and under various conditions. For instance, we demonstrate that elementary conditional probabilities are estimated by these plug-in estimators at a rate of $n^{\alpha - 1/2}$ if one conditions on a VC-class of sets, where $\alpha \in [0, 1/2)$ controls a lower bound on the size of the sets we can estimate given $n$ samples. We obtain similar results for Kolmogorov's conditional expectation and conditional probability, which generalize the elementary forms of conditioning. Due to their simplicity, plug-in estimators can be evaluated in linear time, and there is no up-front cost for inference.
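To make the idea concrete, here is a minimal sketch of an elementary plug-in estimator of a conditional probability, $\hat{P}(Y \in B \mid X \in A) = P_n(A \cap B) / P_n(A)$, where the true measures are replaced by empirical counts. The function names and the toy data are illustrative assumptions, not the paper's code.

```python
import numpy as np

def plugin_cond_prob(x, y, in_A, in_B):
    """Plug-in estimate of P(Y in B | X in A).

    Replaces the unknown measure with the empirical measure:
    the fraction of samples with X in A and Y in B, divided by
    the fraction with X in A. Runs in linear time in n.
    """
    mask_A = in_A(x)
    n_A = mask_A.sum()
    if n_A == 0:
        return np.nan  # conditioning event never observed
    return (mask_A & in_B(y)).sum() / n_A

# Toy example (assumed for illustration): Y = X + noise,
# condition on the set A = {x > 0} and estimate P(Y > 0 | X > 0).
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + rng.normal(size=10_000)
est = plugin_cond_prob(x, y, lambda v: v > 0, lambda v: v > 0)
```

Both counts are computed in a single pass over the sample, which reflects the linear-time evaluation noted in the abstract.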