Learning to Continually Learn Rapidly from Few and Noisy Data

Nicholas I-Hsien Kuo, Mehrtash Harandi, Nicolas Fourrier, Christian Walder, Gabriela Ferraro, Hanna Suominen
AAAI Workshop on Meta-Learning and MetaDL Challenge, PMLR 140:65-76, 2021.

Abstract

Neural networks suffer from catastrophic forgetting and are unable to learn new tasks sequentially without guaranteed stationarity in the data distribution. Continual learning can be achieved via replay: concurrently training on externally stored old data while learning a new task. However, replay becomes less effective when each past task is allocated less memory. To overcome this difficulty, we supplemented the replay mechanics with meta-learning for rapid knowledge acquisition. By employing a meta-learner that learns a learning rate per parameter per past task, we found that base learners produced strong results when less memory was available. Additionally, our approach inherited several meta-learning advantages for continual learning: it demonstrated strong robustness when continually learning in the presence of noise, and it brought base learners to higher accuracy in fewer updates.
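The core mechanism in the abstract, a meta-learner that maintains a learning rate per parameter per past task on top of a replay buffer, can be pictured with a short sketch. The following is a minimal illustration, not the authors' implementation: the tensor shapes, the replay_update helper, the toy quadratic losses, and the use of plain SGD for the new task are all assumptions made for the example.

import torch

# A toy base-learner parameter tensor and two stored past tasks.
base_param = torch.randn(4, 4, requires_grad=True)

# Meta-learned log learning rates: one rate per parameter, per past task
# (exp() keeps the effective rates positive). In the paper's setting these
# would be trained by the meta-learner; here they are simply initialised
# to zero, i.e. an effective rate of 1.
log_lrs = torch.zeros(2, 4, 4, requires_grad=True)  # (num_past_tasks, *param.shape)

def replay_update(param, log_lrs, new_task_loss, replay_losses, new_lr=1e-2):
    """One base-learner step: plain SGD on the new task, plus replayed
    past-task gradients scaled element-wise by meta-learned rates."""
    # Gradient from the new task, applied with a fixed learning rate.
    grad_new = torch.autograd.grad(new_task_loss(param), param)[0]
    step = new_lr * grad_new
    # Gradients from each replayed past task, each scaled element-wise
    # by that task's own per-parameter meta-learned learning rate.
    for t, loss_fn in enumerate(replay_losses):
        grad_t = torch.autograd.grad(loss_fn(param), param)[0]
        step = step + log_lrs[t].exp() * grad_t
    return param - step

# Toy quadratic losses standing in for new-task and replayed mini-batches.
new_task_loss = lambda p: ((p - 1.0) ** 2).mean()
replay_losses = [lambda p: (p ** 2).mean(), lambda p: ((p + 0.5) ** 2).mean()]
updated = replay_update(base_param, log_lrs, new_task_loss, replay_losses)

Storing the rates in log-space and exponentiating is one common way to keep per-parameter rates positive while letting a meta-optimiser train log_lrs with unconstrained gradients; differentiating a meta-objective through replay_update would additionally require create_graph=True in the inner autograd.grad calls.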

Cite this Paper


BibTeX
@InProceedings{pmlr-v140-kuo21a,
  title     = {Learning to Continually Learn Rapidly from Few and Noisy Data},
  author    = {Kuo, Nicholas I-Hsien and Harandi, Mehrtash and Fourrier, Nicolas and Walder, Christian and Ferraro, Gabriela and Suominen, Hanna},
  booktitle = {AAAI Workshop on Meta-Learning and MetaDL Challenge},
  pages     = {65--76},
  year      = {2021},
  editor    = {Guyon, Isabelle and van Rijn, Jan N. and Treguer, Sébastien and Vanschoren, Joaquin},
  volume    = {140},
  series    = {Proceedings of Machine Learning Research},
  month     = {09 Feb},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v140/kuo21a/kuo21a.pdf},
  url       = {https://proceedings.mlr.press/v140/kuo21a.html},
  abstract  = {Neural networks suffer from catastrophic forgetting and are unable to learn new tasks sequentially without guaranteed stationarity in the data distribution. Continual learning can be achieved via replay: concurrently training on externally stored old data while learning a new task. However, replay becomes less effective when each past task is allocated less memory. To overcome this difficulty, we supplemented the replay mechanics with meta-learning for rapid knowledge acquisition. By employing a meta-learner that learns a learning rate per parameter per past task, we found that base learners produced strong results when less memory was available. Additionally, our approach inherited several meta-learning advantages for continual learning: it demonstrated strong robustness when continually learning in the presence of noise, and it brought base learners to higher accuracy in fewer updates.}
}
Endnote
%0 Conference Paper
%T Learning to Continually Learn Rapidly from Few and Noisy Data
%A Nicholas I-Hsien Kuo
%A Mehrtash Harandi
%A Nicolas Fourrier
%A Christian Walder
%A Gabriela Ferraro
%A Hanna Suominen
%B AAAI Workshop on Meta-Learning and MetaDL Challenge
%C Proceedings of Machine Learning Research
%D 2021
%E Isabelle Guyon
%E Jan N. van Rijn
%E Sébastien Treguer
%E Joaquin Vanschoren
%F pmlr-v140-kuo21a
%I PMLR
%P 65--76
%U https://proceedings.mlr.press/v140/kuo21a.html
%V 140
%X Neural networks suffer from catastrophic forgetting and are unable to learn new tasks sequentially without guaranteed stationarity in the data distribution. Continual learning can be achieved via replay: concurrently training on externally stored old data while learning a new task. However, replay becomes less effective when each past task is allocated less memory. To overcome this difficulty, we supplemented the replay mechanics with meta-learning for rapid knowledge acquisition. By employing a meta-learner that learns a learning rate per parameter per past task, we found that base learners produced strong results when less memory was available. Additionally, our approach inherited several meta-learning advantages for continual learning: it demonstrated strong robustness when continually learning in the presence of noise, and it brought base learners to higher accuracy in fewer updates.
APA
Kuo, N.I., Harandi, M., Fourrier, N., Walder, C., Ferraro, G. & Suominen, H. (2021). Learning to Continually Learn Rapidly from Few and Noisy Data. AAAI Workshop on Meta-Learning and MetaDL Challenge, in Proceedings of Machine Learning Research 140:65-76. Available from https://proceedings.mlr.press/v140/kuo21a.html.