Learning from Weakly Dependent Data under Dobrushin’s Condition

Yuval Dagan, Constantinos Daskalakis, Nishanth Dikkala, Siddhartha Jayanti
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:914-928, 2019.

Abstract

Statistical learning theory has largely focused on learning and generalization given independent and identically distributed (i.i.d.) samples. Motivated by applications involving time-series data, there has been a growing literature on learning and generalization in settings where data is sampled from an ergodic process. This work has also developed complexity measures, which appropriately extend the notion of Rademacher complexity to bound the generalization error and learning rates of hypothesis classes in this setting. Rather than time-series data, our work is motivated by settings where data are sampled on a network or a spatial domain and thus do not fit well within the framework of prior work. We provide learning and generalization bounds for data that are complexly dependent, yet whose distribution satisfies the standard Dobrushin’s condition. Indeed, we show that the standard complexity measures of Gaussian and Rademacher complexities and VC dimension are sufficient measures of complexity for the purposes of bounding the generalization error and learning rates of hypothesis classes in our setting. Moreover, our generalization bounds only degrade by constant factors compared to their i.i.d. analogs, and our learnability bounds degrade by log factors in the size of the training set.
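For readers unfamiliar with the two notions at the center of the abstract, the LaTeX sketch below records their standard textbook formulations; the notation (the joint distribution \mu, influences \alpha_{ij}, Rademacher variables \sigma_i) is the conventional one and is not reproduced from the paper itself.

% Dobrushin's condition: the usual formulation, stated here for context.
% For a joint distribution \mu over (X_1, \dots, X_n), the influence of
% coordinate j on coordinate i is the worst-case total-variation change
% in the conditional law of X_i when only the j-th conditioned value moves:
\[
  \alpha_{ij} \;=\; \max_{\substack{x_{-i},\, x'_{-i} \\ \text{differing only at } j}}
  d_{\mathrm{TV}}\!\bigl(\mu(X_i \mid x_{-i}),\, \mu(X_i \mid x'_{-i})\bigr).
\]
% The condition requires the total influence on every coordinate to be
% bounded away from one:
\[
  \max_i \sum_{j \neq i} \alpha_{ij} \;\le\; 1 - \epsilon
  \qquad \text{for some } \epsilon > 0,
\]
% so i.i.d. samples are the special case \alpha_{ij} = 0 for all i, j.
% Rademacher complexity of a hypothesis class \mathcal{F} on n samples,
% with \sigma_1, \dots, \sigma_n i.i.d. uniform on \{-1, +1\}:
\[
  \mathcal{R}_n(\mathcal{F}) \;=\;
  \mathbb{E}_{Z,\sigma}\Bigl[\, \sup_{f \in \mathcal{F}}
  \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(Z_i) \Bigr].
\]

The abstract's claim, in these terms, is that this i.i.d.-style complexity measure still controls generalization when the samples merely satisfy the condition above, losing only constant factors.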

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-dagan19a,
  title     = {Learning from Weakly Dependent Data under Dobrushin’s Condition},
  author    = {Dagan, Yuval and Daskalakis, Constantinos and Dikkala, Nishanth and Jayanti, Siddhartha},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {914--928},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/dagan19a/dagan19a.pdf},
  url       = {https://proceedings.mlr.press/v99/dagan19a.html},
  abstract  = {Statistical learning theory has largely focused on learning and generalization given independent and identically distributed (i.i.d.) samples. Motivated by applications involving time-series data, there has been a growing literature on learning and generalization in settings where data is sampled from an ergodic process. This work has also developed complexity measures, which appropriately extend the notion of Rademacher complexity to bound the generalization error and learning rates of hypothesis classes in this setting. Rather than time-series data, our work is motivated by settings where data are sampled on a network or a spatial domain and thus do not fit well within the framework of prior work. We provide learning and generalization bounds for data that are complexly dependent, yet whose distribution satisfies the standard Dobrushin’s condition. Indeed, we show that the standard complexity measures of Gaussian and Rademacher complexities and VC dimension are sufficient measures of complexity for the purposes of bounding the generalization error and learning rates of hypothesis classes in our setting. Moreover, our generalization bounds only degrade by constant factors compared to their i.i.d. analogs, and our learnability bounds degrade by log factors in the size of the training set.}
}
Endnote
%0 Conference Paper
%T Learning from Weakly Dependent Data under Dobrushin’s Condition
%A Yuval Dagan
%A Constantinos Daskalakis
%A Nishanth Dikkala
%A Siddhartha Jayanti
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-dagan19a
%I PMLR
%P 914--928
%U https://proceedings.mlr.press/v99/dagan19a.html
%V 99
%X Statistical learning theory has largely focused on learning and generalization given independent and identically distributed (i.i.d.) samples. Motivated by applications involving time-series data, there has been a growing literature on learning and generalization in settings where data is sampled from an ergodic process. This work has also developed complexity measures, which appropriately extend the notion of Rademacher complexity to bound the generalization error and learning rates of hypothesis classes in this setting. Rather than time-series data, our work is motivated by settings where data are sampled on a network or a spatial domain and thus do not fit well within the framework of prior work. We provide learning and generalization bounds for data that are complexly dependent, yet whose distribution satisfies the standard Dobrushin’s condition. Indeed, we show that the standard complexity measures of Gaussian and Rademacher complexities and VC dimension are sufficient measures of complexity for the purposes of bounding the generalization error and learning rates of hypothesis classes in our setting. Moreover, our generalization bounds only degrade by constant factors compared to their i.i.d. analogs, and our learnability bounds degrade by log factors in the size of the training set.
APA
Dagan, Y., Daskalakis, C., Dikkala, N. & Jayanti, S. (2019). Learning from Weakly Dependent Data under Dobrushin’s Condition. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:914-928. Available from https://proceedings.mlr.press/v99/dagan19a.html.
