Generalization Bound for Infinitely Divisible Empirical Process
Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, PMLR 15:864-872, 2011.
Abstract
In this paper, we study the generalization bound for an empirical process of samples independently drawn from an infinitely divisible (ID) distribution, which is termed the ID empirical process. In particular, based on a martingale method, we develop deviation inequalities for a sequence of random variables following an ID distribution. By applying the obtained deviation inequalities, we then derive the generalization bound for the ID empirical process based on the annealed Vapnik-Chervonenkis (VC) entropy. Afterward, according to Sauer's lemma, we obtain the generalization bound for the ID empirical process based on the VC dimension. Finally, by using the resulting bound, we analyze the asymptotic convergence of the ID empirical process and show that its convergence rate can reach $O\left(\left(\frac{\Lambda_\mathcal{F}(2N)}{N}\right)^\frac{1}{1.3}\right)$, which is faster than the rate obtained for the generic i.i.d. empirical process (Vapnik, 1999).
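A minimal worked comparison of the two rates (a sketch, not taken from the paper itself): assuming the comparable classical i.i.d. rate based on the annealed VC entropy in Vapnik (1999) is of order $\left(\frac{\Lambda_\mathcal{F}(2N)}{N}\right)^{\frac{1}{2}}$ in the same quantity, the faster convergence claimed above follows from the exponents alone:

% Sketch under the stated assumption on the i.i.d. rate; not a result quoted from the paper.
\[
  \underbrace{\left(\frac{\Lambda_{\mathcal{F}}(2N)}{N}\right)^{\frac{1}{1.3}}}_{\text{ID empirical process}}
  \;<\;
  \underbrace{\left(\frac{\Lambda_{\mathcal{F}}(2N)}{N}\right)^{\frac{1}{2}}}_{\text{i.i.d. empirical process}}
  \qquad \text{whenever } \frac{\Lambda_{\mathcal{F}}(2N)}{N} < 1,
\]
\[
  \text{since } \tfrac{1}{1.3} \approx 0.77 > \tfrac{1}{2}
  \quad\text{and}\quad x^{a} < x^{b} \ \text{ for } 0 < x < 1,\ a > b.
\]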