Data-dependent and Oracle Bounds on Forgetting in Continual Learning

Lior Friedman, Ron Meir
Proceedings of The 4th Conference on Lifelong Learning Agents, PMLR 330:713-735, 2026.

Abstract

In continual learning, knowledge must be preserved and re-used between tasks, maintaining good transfer to future tasks and minimizing forgetting of previously learned ones. While several practical algorithms have been devised for this setting, there have been few theoretical works aiming to quantify and bound the degree of forgetting in general settings. For *exemplar-free* methods, we provide both data-dependent upper bounds that apply *regardless of model and algorithm choice*, and oracle bounds for Gibbs posteriors. We derive an algorithm based on our bounds and demonstrate empirically that our approach yields tight and practical bounds on forgetting for several continual learning problems and algorithms.

Cite this Paper


BibTeX
@InProceedings{pmlr-v330-friedman26a,
  title = {Data-dependent and Oracle Bounds on Forgetting in Continual Learning},
  author = {Friedman, Lior and Meir, Ron},
  booktitle = {Proceedings of The 4th Conference on Lifelong Learning Agents},
  pages = {713--735},
  year = {2026},
  editor = {Chandar, Sarath and Pascanu, Razvan and Eaton, Eric and Liu, Bing and Mahmood, Rupam and Rannen-Triki, Amal},
  volume = {330},
  series = {Proceedings of Machine Learning Research},
  month = {11--14 Aug},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v330/main/assets/friedman26a/friedman26a.pdf},
  url = {https://proceedings.mlr.press/v330/friedman26a.html},
  abstract = {In continual learning, knowledge must be preserved and re-used between tasks, maintaining good transfer to future tasks and minimizing forgetting of previously learned ones. While several practical algorithms have been devised for this setting, there have been few theoretical works aiming to quantify and bound the degree of forgetting in general settings. For *exemplar-free* methods, we provide both data-dependent upper bounds that apply *regardless of model and algorithm choice*, and oracle bounds for Gibbs posteriors. We derive an algorithm based on our bounds and demonstrate empirically that our approach yields tight and practical bounds on forgetting for several continual learning problems and algorithms.}
}
Endnote
%0 Conference Paper
%T Data-dependent and Oracle Bounds on Forgetting in Continual Learning
%A Lior Friedman
%A Ron Meir
%B Proceedings of The 4th Conference on Lifelong Learning Agents
%C Proceedings of Machine Learning Research
%D 2026
%E Sarath Chandar
%E Razvan Pascanu
%E Eric Eaton
%E Bing Liu
%E Rupam Mahmood
%E Amal Rannen-Triki
%F pmlr-v330-friedman26a
%I PMLR
%P 713--735
%U https://proceedings.mlr.press/v330/friedman26a.html
%V 330
%X In continual learning, knowledge must be preserved and re-used between tasks, maintaining good transfer to future tasks and minimizing forgetting of previously learned ones. While several practical algorithms have been devised for this setting, there have been few theoretical works aiming to quantify and bound the degree of forgetting in general settings. For *exemplar-free* methods, we provide both data-dependent upper bounds that apply *regardless of model and algorithm choice*, and oracle bounds for Gibbs posteriors. We derive an algorithm based on our bounds and demonstrate empirically that our approach yields tight and practical bounds on forgetting for several continual learning problems and algorithms.
APA
Friedman, L. & Meir, R. (2026). Data-dependent and Oracle Bounds on Forgetting in Continual Learning. Proceedings of The 4th Conference on Lifelong Learning Agents, in Proceedings of Machine Learning Research 330:713-735. Available from https://proceedings.mlr.press/v330/friedman26a.html.