Optimal Continual Learning has Perfect Memory and is NP-hard

Jeremias Knoblauch, Hisham Husain, Tom Diethe
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:5327-5337, 2020.

Abstract

Continual Learning (CL) algorithms incrementally learn a predictor or representation across multiple sequentially observed tasks. Designing CL algorithms that perform reliably and avoid so-called catastrophic forgetting has proven a persistent challenge. The current paper develops a theoretical approach that explains why. In particular, we derive the computational properties which CL algorithms would have to possess in order to avoid catastrophic forgetting. Our main finding is that such optimal CL algorithms generally solve an NP-hard problem and will require perfect memory to do so. The findings are of theoretical interest, but also explain the excellent performance of CL algorithms using experience replay, episodic memory and core sets relative to regularization-based approaches.
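To make the contrast in the last sentence concrete, here is a minimal toy sketch in Python. It is not the paper's construction, and the helper names (make_task, fit_regularized, fit_with_replay) are invented for this illustration: a regularization-based learner compresses the first task into a fixed-size quadratic penalty around its previous weights (an EWC-style surrogate memory), while a replay learner keeps the raw past samples, i.e. perfect memory, and refits on the union.

import numpy as np

rng = np.random.default_rng(0)

def make_task(slope, n=100):
    """Generate (x, y) pairs for y = slope * x + noise."""
    x = rng.uniform(-1, 1, n)
    y = slope * x + 0.05 * rng.normal(size=n)
    return x, y

def fit_regularized(x, y, w_prev, lam=10.0):
    """Regularization-based CL: the past survives only as a quadratic
    penalty around the previous weights. Minimizes
    sum_i (y_i - w x_i)^2 + lam (w - w_prev)^2 in closed form."""
    return (x @ y + lam * w_prev) / (x @ x + lam)

def fit_with_replay(x, y, memory):
    """Replay / episodic-memory CL: store raw samples and refit on the
    union of all tasks seen so far (the perfect-memory extreme)."""
    memory.append((x, y))
    xs = np.concatenate([m[0] for m in memory])
    ys = np.concatenate([m[1] for m in memory])
    return xs @ ys / (xs @ xs), memory

# Task 1 (slope +1), then task 2 (slope -1): the tasks conflict, so any
# fixed-size summary of task 1 must lose information about it.
x1, y1 = make_task(+1.0)
x2, y2 = make_task(-1.0)

w_reg = fit_regularized(x1, y1, w_prev=0.0)
w_reg = fit_regularized(x2, y2, w_prev=w_reg)

w_rep, mem = fit_with_replay(x1, y1, memory=[])
w_rep, mem = fit_with_replay(x2, y2, memory=mem)

for name, w in [("regularized", w_reg), ("replay", w_rep)]:
    loss1 = np.mean((y1 - w * x1) ** 2)  # loss on the old task
    print(f"{name}: w={w:+.3f}, task-1 loss={loss1:.3f}")

Running this prints a noticeably larger task-1 loss for the regularized learner: the replay learner recovers the joint optimum of both tasks, while the fixed-size penalty drifts toward the new task. This mirrors, in miniature, the paper's observation that memory-based methods fare better than regularization-based ones.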

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-knoblauch20a,
  title     = {Optimal Continual Learning has Perfect Memory and is {NP}-hard},
  author    = {Knoblauch, Jeremias and Husain, Hisham and Diethe, Tom},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {5327--5337},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/knoblauch20a/knoblauch20a.pdf},
  url       = {https://proceedings.mlr.press/v119/knoblauch20a.html},
  abstract  = {Continual Learning (CL) algorithms incrementally learn a predictor or representation across multiple sequentially observed tasks. Designing CL algorithms that perform reliably and avoid so-called catastrophic forgetting has proven a persistent challenge. The current paper develops a theoretical approach that explains why. In particular, we derive the computational properties which CL algorithms would have to possess in order to avoid catastrophic forgetting. Our main finding is that such optimal CL algorithms generally solve an NP-hard problem and will require perfect memory to do so. The findings are of theoretical interest, but also explain the excellent performance of CL algorithms using experience replay, episodic memory and core sets relative to regularization-based approaches.}
}
Endnote
%0 Conference Paper
%T Optimal Continual Learning has Perfect Memory and is NP-hard
%A Jeremias Knoblauch
%A Hisham Husain
%A Tom Diethe
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-knoblauch20a
%I PMLR
%P 5327--5337
%U https://proceedings.mlr.press/v119/knoblauch20a.html
%V 119
%X Continual Learning (CL) algorithms incrementally learn a predictor or representation across multiple sequentially observed tasks. Designing CL algorithms that perform reliably and avoid so-called catastrophic forgetting has proven a persistent challenge. The current paper develops a theoretical approach that explains why. In particular, we derive the computational properties which CL algorithms would have to possess in order to avoid catastrophic forgetting. Our main finding is that such optimal CL algorithms generally solve an NP-hard problem and will require perfect memory to do so. The findings are of theoretical interest, but also explain the excellent performance of CL algorithms using experience replay, episodic memory and core sets relative to regularization-based approaches.
APA
Knoblauch, J., Husain, H. & Diethe, T. (2020). Optimal Continual Learning has Perfect Memory and is NP-hard. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:5327-5337. Available from https://proceedings.mlr.press/v119/knoblauch20a.html.
