Predicting the Susceptibility of Examples to Catastrophic Forgetting

Guy Hacohen, Tinne Tuytelaars
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:21546-21569, 2025.

Abstract

Catastrophic forgetting – the tendency of neural networks to forget previously learned data when learning new information – remains a central challenge in continual learning. In this work, we adopt a behavioral approach, observing a connection between learning speed and forgetting: examples learned more quickly are less prone to forgetting. Focusing on replay-based continual learning, we show that the composition of the replay buffer – specifically, whether it contains quickly or slowly learned examples – has a significant effect on forgetting. Motivated by this insight, we introduce Speed-Based Sampling (SBS), a simple yet general strategy that selects replay examples based on their learning speed. SBS integrates easily into existing buffer-based methods and improves performance across a wide range of competitive continual learning benchmarks, advancing state-of-the-art results. Our findings underscore the value of accounting for the forgetting dynamics when designing continual learning algorithms.
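The abstract describes Speed-Based Sampling only at a high level. As a rough illustration, the sketch below shows one way speed-based replay selection could look in code; it assumes learning speed is scored by accumulated per-example accuracy during training and that the model exposes a simple predict() interface. All names and choices here are illustrative assumptions, not the paper's actual implementation.

```python
# A minimal sketch of speed-based replay-buffer selection (assumptions, not the
# paper's method): "learning speed" is approximated by how often each example is
# classified correctly across training epochs, so faster-learned examples
# accumulate correct predictions earlier.
import numpy as np


def track_learning_speed(model, dataset, train_one_epoch, num_epochs):
    """Train on the current task while recording a per-example speed score."""
    correct_counts = np.zeros(len(dataset))
    for _ in range(num_epochs):
        train_one_epoch(model, dataset)          # user-supplied training step
        preds = model.predict(dataset.inputs)    # assumed predict() interface
        correct_counts += (preds == dataset.labels)
    # Higher score = learned earlier / more consistently = "fast" example.
    return correct_counts / num_epochs


def speed_based_sampling(speed_scores, buffer_size, prefer="fast", rng=None):
    """Pick replay-buffer indices according to learning speed."""
    rng = rng or np.random.default_rng()
    order = np.argsort(speed_scores)             # slowest -> fastest
    if prefer == "fast":
        return order[-buffer_size:]
    if prefer == "slow":
        return order[:buffer_size]
    # Otherwise sample proportionally to speed (assumes nonzero total score).
    p = speed_scores / speed_scores.sum()
    return rng.choice(len(speed_scores), size=buffer_size, replace=False, p=p)
```

In a replay-based continual learning loop, one would call track_learning_speed while training on the current task and then fill the buffer with the indices returned by speed_based_sampling before moving to the next task.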

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-hacohen25a,
  title     = {Predicting the Susceptibility of Examples to Catastrophic Forgetting},
  author    = {Hacohen, Guy and Tuytelaars, Tinne},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {21546--21569},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/hacohen25a/hacohen25a.pdf},
  url       = {https://proceedings.mlr.press/v267/hacohen25a.html},
  abstract  = {Catastrophic forgetting – the tendency of neural networks to forget previously learned data when learning new information – remains a central challenge in continual learning. In this work, we adopt a behavioral approach, observing a connection between learning speed and forgetting: examples learned more quickly are less prone to forgetting. Focusing on replay-based continual learning, we show that the composition of the replay buffer – specifically, whether it contains quickly or slowly learned examples – has a significant effect on forgetting. Motivated by this insight, we introduce Speed-Based Sampling (SBS), a simple yet general strategy that selects replay examples based on their learning speed. SBS integrates easily into existing buffer-based methods and improves performance across a wide range of competitive continual learning benchmarks, advancing state-of-the-art results. Our findings underscore the value of accounting for the forgetting dynamics when designing continual learning algorithms.}
}
Endnote
%0 Conference Paper
%T Predicting the Susceptibility of Examples to Catastrophic Forgetting
%A Guy Hacohen
%A Tinne Tuytelaars
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-hacohen25a
%I PMLR
%P 21546--21569
%U https://proceedings.mlr.press/v267/hacohen25a.html
%V 267
%X Catastrophic forgetting – the tendency of neural networks to forget previously learned data when learning new information – remains a central challenge in continual learning. In this work, we adopt a behavioral approach, observing a connection between learning speed and forgetting: examples learned more quickly are less prone to forgetting. Focusing on replay-based continual learning, we show that the composition of the replay buffer – specifically, whether it contains quickly or slowly learned examples – has a significant effect on forgetting. Motivated by this insight, we introduce Speed-Based Sampling (SBS), a simple yet general strategy that selects replay examples based on their learning speed. SBS integrates easily into existing buffer-based methods and improves performance across a wide range of competitive continual learning benchmarks, advancing state-of-the-art results. Our findings underscore the value of accounting for the forgetting dynamics when designing continual learning algorithms.
APA
Hacohen, G. & Tuytelaars, T. (2025). Predicting the Susceptibility of Examples to Catastrophic Forgetting. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:21546-21569. Available from https://proceedings.mlr.press/v267/hacohen25a.html.
