Continual Learning Through Synaptic Intelligence

Friedemann Zenke, Ben Poole, Surya Ganguli
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:3987-3995, 2017.

Abstract

While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning. In stark contrast, biological neural networks continually adapt to changing domains, possibly by leveraging complex molecular machinery to solve many tasks simultaneously. In this study, we introduce intelligent synapses that bring some of this biological complexity into artificial neural networks. Each synapse accumulates task relevant information over time, and exploits this information to rapidly store new memories without forgetting old ones. We evaluate our approach on continual learning of classification tasks, and show that it dramatically reduces forgetting while maintaining computational efficiency.
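The "task relevant information" each synapse accumulates is, in this paper, a per-parameter importance measure built up along the training trajectory and then used to penalize changes to important weights on later tasks. A minimal NumPy sketch of that bookkeeping follows; the class and variable names are ours for illustration, not the authors' released code, and hyperparameters like the damping term are placeholders.

```python
import numpy as np

class SynapticImportance:
    """Hedged sketch of per-parameter importance tracking in the style of
    synaptic intelligence: credit each parameter for the loss it helped
    reduce along the training path, then penalize moving it later."""

    def __init__(self, n_params, xi=0.1, c=1.0):
        self.xi = xi                          # damping term, avoids division by zero
        self.c = c                            # regularization strength
        self.omega = np.zeros(n_params)       # running path-integral contribution
        self.Omega = np.zeros(n_params)       # consolidated importance across tasks
        self.theta_ref = np.zeros(n_params)   # parameters at the last task boundary

    def accumulate(self, grad, delta_theta):
        # Called after every gradient step: -grad * delta_theta approximates
        # how much this update reduced the loss, per parameter.
        self.omega += -grad * delta_theta

    def consolidate(self, theta):
        # Called at a task boundary: normalize the path integral by the total
        # parameter displacement, fold it into Omega, and reset the accumulator.
        delta = theta - self.theta_ref
        self.Omega += self.omega / (delta ** 2 + self.xi)
        self.theta_ref = theta.copy()
        self.omega = np.zeros_like(self.omega)

    def penalty(self, theta):
        # Surrogate loss added to the new task's objective: quadratic pull
        # toward the consolidated parameters, weighted by importance.
        return self.c * np.sum(self.Omega * (theta - self.theta_ref) ** 2)
```

In use, `accumulate` runs once per optimizer step during a task and `consolidate` once when the task ends; the new task is then trained on its own loss plus `penalty(theta)`.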

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-zenke17a,
  title     = {Continual Learning Through Synaptic Intelligence},
  author    = {Friedemann Zenke and Ben Poole and Surya Ganguli},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {3987--3995},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/zenke17a/zenke17a.pdf},
  url       = {http://proceedings.mlr.press/v70/zenke17a.html},
  abstract  = {While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning. In stark contrast, biological neural networks continually adapt to changing domains, possibly by leveraging complex molecular machinery to solve many tasks simultaneously. In this study, we introduce intelligent synapses that bring some of this biological complexity into artificial neural networks. Each synapse accumulates task relevant information over time, and exploits this information to rapidly store new memories without forgetting old ones. We evaluate our approach on continual learning of classification tasks, and show that it dramatically reduces forgetting while maintaining computational efficiency.}
}
Endnote
%0 Conference Paper
%T Continual Learning Through Synaptic Intelligence
%A Friedemann Zenke
%A Ben Poole
%A Surya Ganguli
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-zenke17a
%I PMLR
%P 3987--3995
%U http://proceedings.mlr.press/v70/zenke17a.html
%V 70
%X While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning. In stark contrast, biological neural networks continually adapt to changing domains, possibly by leveraging complex molecular machinery to solve many tasks simultaneously. In this study, we introduce intelligent synapses that bring some of this biological complexity into artificial neural networks. Each synapse accumulates task relevant information over time, and exploits this information to rapidly store new memories without forgetting old ones. We evaluate our approach on continual learning of classification tasks, and show that it dramatically reduces forgetting while maintaining computational efficiency.
APA
Zenke, F., Poole, B. & Ganguli, S. (2017). Continual Learning Through Synaptic Intelligence. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:3987-3995. Available from http://proceedings.mlr.press/v70/zenke17a.html.
