Generalization Bound for Infinitely Divisible Empirical Process

Chao Zhang, Dacheng Tao
Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, PMLR 15:864-872, 2011.

Abstract

In this paper, we study the generalization bound for an empirical process of samples independently drawn from an infinitely divisible (ID) distribution, which is termed the ID empirical process. In particular, based on a martingale method, we develop deviation inequalities for a sequence of random variables drawn from an ID distribution. By applying the obtained deviation inequalities, we then show the generalization bound for the ID empirical process based on the annealed Vapnik-Chervonenkis (VC) entropy. Afterward, according to Sauer's lemma, we obtain the generalization bound for the ID empirical process based on the VC dimension. Finally, by using the resulting bound, we analyze the asymptotic convergence of the ID empirical process and show that its convergence rate can reach $O\left(\left(\frac{\Lambda_\mathcal{F}(2N)}{N}\right)^\frac{1}{1.3}\right)$, which is faster than that of the generic i.i.d. empirical process (Vapnik, 1999).
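As a rough numerical sketch (not taken from the paper), the stated ID rate $O\left(\left(\frac{\Lambda_\mathcal{F}(2N)}{N}\right)^\frac{1}{1.3}\right)$ can be compared with a classical square-root-type rate $O\left(\left(\frac{\Lambda_\mathcal{F}(2N)}{N}\right)^\frac{1}{2}\right)$ for the i.i.d. case: since $1/1.3 \approx 0.769 > 1/2$, the ID rate term decays faster whenever the ratio $\Lambda_\mathcal{F}(2N)/N$ is below 1. The ratio values below are illustrative placeholders, not quantities computed in the paper.

```python
import math

def id_rate(ratio):
    """Rate term (ratio)^(1/1.3) from the paper's bound; `ratio` stands
    for Lambda_F(2N)/N and is assumed to lie in (0, 1)."""
    return ratio ** (1 / 1.3)

def iid_rate(ratio):
    """Classical (ratio)^(1/2)-type rate for the i.i.d. empirical process."""
    return math.sqrt(ratio)

# For any ratio in (0, 1), the exponent 1/1.3 exceeds 1/2, so the
# ID rate term is the smaller (faster-decaying) of the two.
for ratio in (0.5, 0.1, 0.01, 0.001):
    assert id_rate(ratio) < iid_rate(ratio)
```

This only visualizes the exponent comparison; the paper's actual constants and the precise form of the i.i.d. baseline are in the text itself.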

Cite this Paper


BibTeX
@InProceedings{pmlr-v15-zhang11b, title = {Generalization Bound for Infinitely Divisible Empirical Process}, author = {Zhang, Chao and Tao, Dacheng}, booktitle = {Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics}, pages = {864--872}, year = {2011}, editor = {Gordon, Geoffrey and Dunson, David and Dudík, Miroslav}, volume = {15}, series = {Proceedings of Machine Learning Research}, address = {Fort Lauderdale, FL, USA}, month = {11--13 Apr}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v15/zhang11b/zhang11b.pdf}, url = {https://proceedings.mlr.press/v15/zhang11b.html}, abstract = {In this paper, we study the generalization bound for an empirical process of samples independently drawn from an infinitely divisible (ID) distribution, which is termed the ID empirical process. In particular, based on a martingale method, we develop deviation inequalities for a sequence of random variables drawn from an ID distribution. By applying the obtained deviation inequalities, we then show the generalization bound for the ID empirical process based on the annealed Vapnik-Chervonenkis (VC) entropy. Afterward, according to Sauer's lemma, we obtain the generalization bound for the ID empirical process based on the VC dimension. Finally, by using the resulting bound, we analyze the asymptotic convergence of the ID empirical process and show that its convergence rate can reach $O\left(\left(\frac{\Lambda_\mathcal{F}(2N)}{N}\right)^\frac{1}{1.3}\right)$, which is faster than that of the generic i.i.d. empirical process (Vapnik, 1999).} }
Endnote
%0 Conference Paper %T Generalization Bound for Infinitely Divisible Empirical Process %A Chao Zhang %A Dacheng Tao %B Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2011 %E Geoffrey Gordon %E David Dunson %E Miroslav Dudík %F pmlr-v15-zhang11b %I PMLR %P 864--872 %U https://proceedings.mlr.press/v15/zhang11b.html %V 15 %X In this paper, we study the generalization bound for an empirical process of samples independently drawn from an infinitely divisible (ID) distribution, which is termed the ID empirical process. In particular, based on a martingale method, we develop deviation inequalities for a sequence of random variables drawn from an ID distribution. By applying the obtained deviation inequalities, we then show the generalization bound for the ID empirical process based on the annealed Vapnik-Chervonenkis (VC) entropy. Afterward, according to Sauer's lemma, we obtain the generalization bound for the ID empirical process based on the VC dimension. Finally, by using the resulting bound, we analyze the asymptotic convergence of the ID empirical process and show that its convergence rate can reach $O\left(\left(\frac{\Lambda_\mathcal{F}(2N)}{N}\right)^\frac{1}{1.3}\right)$, which is faster than that of the generic i.i.d. empirical process (Vapnik, 1999).
RIS
TY - CPAPER TI - Generalization Bound for Infinitely Divisible Empirical Process AU - Chao Zhang AU - Dacheng Tao BT - Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics DA - 2011/06/14 ED - Geoffrey Gordon ED - David Dunson ED - Miroslav Dudík ID - pmlr-v15-zhang11b PB - PMLR DP - Proceedings of Machine Learning Research VL - 15 SP - 864 EP - 872 L1 - http://proceedings.mlr.press/v15/zhang11b/zhang11b.pdf UR - https://proceedings.mlr.press/v15/zhang11b.html AB - In this paper, we study the generalization bound for an empirical process of samples independently drawn from an infinitely divisible (ID) distribution, which is termed the ID empirical process. In particular, based on a martingale method, we develop deviation inequalities for a sequence of random variables drawn from an ID distribution. By applying the obtained deviation inequalities, we then show the generalization bound for the ID empirical process based on the annealed Vapnik-Chervonenkis (VC) entropy. Afterward, according to Sauer's lemma, we obtain the generalization bound for the ID empirical process based on the VC dimension. Finally, by using the resulting bound, we analyze the asymptotic convergence of the ID empirical process and show that its convergence rate can reach $O\left(\left(\frac{\Lambda_\mathcal{F}(2N)}{N}\right)^\frac{1}{1.3}\right)$, which is faster than that of the generic i.i.d. empirical process (Vapnik, 1999). ER -
APA
Zhang, C. & Tao, D. (2011). Generalization Bound for Infinitely Divisible Empirical Process. Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 15:864-872. Available from https://proceedings.mlr.press/v15/zhang11b.html.