Generalization Bounds for Dependent Data using Online-to-Batch Conversion.

Sagnik Chatterjee, Manuj Mukherjee, Alhad Sethi
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:2152-2160, 2025.

Abstract

In this work, we give generalization bounds, both in expectation and with high probability, for statistical learning algorithms trained on samples drawn from a dependent data source, using the Online-to-Batch conversion paradigm. We show that the generalization error of statistical learners in the dependent data setting matches the generalization error of statistical learners in the i.i.d. setting up to an additive term that depends on the decay rate of the underlying mixing stochastic process and is independent of the complexity of the statistical learner. Our proof techniques involve defining a new notion of stability of online learning algorithms based on Wasserstein distances, and employing "near-martingale" concentration bounds for dependent random variables to arrive at appropriate upper bounds on the generalization error of statistical learners trained on dependent data. Finally, we prove that the Exponential Weighted Averages (EWA) algorithm satisfies our new notion of stability, and we instantiate our bounds using the EWA algorithm.
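As a point of reference only, and not taken from the paper (whose precise definitions may differ), the two standard ingredients named above can be sketched in textbook form: EWA maintains a distribution p_t over a finite hypothesis class H and exponentially reweights each hypothesis by its observed loss, while Online-to-Batch conversion turns the sequence of online predictors into a single statistical learner, for instance by averaging.

% Minimal textbook-style sketch; notation is assumed, not the paper's.
% EWA update: with learning rate \eta > 0, loss \ell(h, z_t) of hypothesis h
% on the sample z_t observed at round t, the weight on h at round t+1 is
\[
  p_{t+1}(h) \;=\;
  \frac{p_t(h)\,\exp\!\bigl(-\eta\,\ell(h, z_t)\bigr)}
       {\sum_{h' \in \mathcal{H}} p_t(h')\,\exp\!\bigl(-\eta\,\ell(h', z_t)\bigr)}.
\]
% Online-to-Batch conversion: after T rounds, return a single predictor,
% e.g. the average of the online iterates or a draw from the averaged weights,
\[
  \bar{h}_T \;=\; \frac{1}{T}\sum_{t=1}^{T} h_t
  \qquad\text{or}\qquad
  \bar{p}_T \;=\; \frac{1}{T}\sum_{t=1}^{T} p_t,
\]
% so that the online learner's regret controls the batch generalization error.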

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-chatterjee25b,
  title     = {Generalization Bounds for Dependent Data using Online-to-Batch Conversion.},
  author    = {Chatterjee, Sagnik and Mukherjee, Manuj and Sethi, Alhad},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {2152--2160},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/chatterjee25b/chatterjee25b.pdf},
  url       = {https://proceedings.mlr.press/v258/chatterjee25b.html}
}
Endnote
%0 Conference Paper
%T Generalization Bounds for Dependent Data using Online-to-Batch Conversion.
%A Sagnik Chatterjee
%A Manuj Mukherjee
%A Alhad Sethi
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-chatterjee25b
%I PMLR
%P 2152--2160
%U https://proceedings.mlr.press/v258/chatterjee25b.html
%V 258
APA
Chatterjee, S., Mukherjee, M. & Sethi, A. (2025). Generalization Bounds for Dependent Data using Online-to-Batch Conversion. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:2152-2160. Available from https://proceedings.mlr.press/v258/chatterjee25b.html.
