Testing the Genomic Bottleneck Hypothesis in Hebbian Meta-Learning

Rasmus Berg Palm, Elias Najarro, Sebastian Risi
NeurIPS 2020 Workshop on Pre-registration in Machine Learning, PMLR 148:100-110, 2021.

Abstract

Hebbian meta-learning has recently shown promise to solve hard reinforcement learning problems, allowing agents to adapt to some degree to changes in the environment. However, because each synapse in these approaches can learn a very specific learning rule, the ability to generalize to very different situations is likely reduced. We hypothesize that limiting the number of Hebbian learning rules through a “genomic bottleneck” can act as a regularizer leading to better generalization across changes to the environment. We test this hypothesis by decoupling the number of Hebbian learning rules from the number of synapses and systematically varying the number of Hebbian learning rules. The results in this paper suggest that simultaneously learning the Hebbian learning rules and their assignment to synapses is a difficult optimization problem, leading to poor performance in the environments tested. However, parallel research to ours finds that it is indeed possible to reduce the number of learning rules by clustering similar rules together. How to best implement a “genomic bottleneck” algorithm is thus an important research direction that warrants further investigation.

Cite this Paper

BibTeX
@InProceedings{pmlr-v148-palm21a,
  title     = {Testing the Genomic Bottleneck Hypothesis in Hebbian Meta-Learning},
  author    = {Palm, Rasmus Berg and Najarro, Elias and Risi, Sebastian},
  booktitle = {NeurIPS 2020 Workshop on Pre-registration in Machine Learning},
  pages     = {100--110},
  year      = {2021},
  editor    = {Bertinetto, Luca and Henriques, João F. and Albanie, Samuel and Paganini, Michela and Varol, Gül},
  volume    = {148},
  series    = {Proceedings of Machine Learning Research},
  month     = {11 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v148/palm21a/palm21a.pdf},
  url       = {https://proceedings.mlr.press/v148/palm21a.html},
  abstract  = {Hebbian meta-learning has recently shown promise to solve hard reinforcement learning problems, allowing agents to adapt to some degree to changes in the environment. However, because each synapse in these approaches can learn a very specific learning rule, the ability to generalize to very different situations is likely reduced. We hypothesize that limiting the number of Hebbian learning rules through a “genomic bottleneck” can act as a regularizer leading to better generalization across changes to the environment. We test this hypothesis by decoupling the number of Hebbian learning rules from the number of synapses and systematically varying the number of Hebbian learning rules. The results in this paper suggest that simultaneously learning the Hebbian learning rules and their assignment to synapses is a difficult optimization problem, leading to poor performance in the environments tested. However, parallel research to ours finds that it is indeed possible to reduce the number of learning rules by clustering similar rules together. How to best implement a “genomic bottleneck” algorithm is thus an important research direction that warrants further investigation.}
}
Endnote
%0 Conference Paper
%T Testing the Genomic Bottleneck Hypothesis in Hebbian Meta-Learning
%A Rasmus Berg Palm
%A Elias Najarro
%A Sebastian Risi
%B NeurIPS 2020 Workshop on Pre-registration in Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Luca Bertinetto
%E João F. Henriques
%E Samuel Albanie
%E Michela Paganini
%E Gül Varol
%F pmlr-v148-palm21a
%I PMLR
%P 100--110
%U https://proceedings.mlr.press/v148/palm21a.html
%V 148
%X Hebbian meta-learning has recently shown promise to solve hard reinforcement learning problems, allowing agents to adapt to some degree to changes in the environment. However, because each synapse in these approaches can learn a very specific learning rule, the ability to generalize to very different situations is likely reduced. We hypothesize that limiting the number of Hebbian learning rules through a “genomic bottleneck” can act as a regularizer leading to better generalization across changes to the environment. We test this hypothesis by decoupling the number of Hebbian learning rules from the number of synapses and systematically varying the number of Hebbian learning rules. The results in this paper suggest that simultaneously learning the Hebbian learning rules and their assignment to synapses is a difficult optimization problem, leading to poor performance in the environments tested. However, parallel research to ours finds that it is indeed possible to reduce the number of learning rules by clustering similar rules together. How to best implement a “genomic bottleneck” algorithm is thus an important research direction that warrants further investigation.
APA
Palm, R. B., Najarro, E., & Risi, S. (2021). Testing the Genomic Bottleneck Hypothesis in Hebbian Meta-Learning. NeurIPS 2020 Workshop on Pre-registration in Machine Learning, in Proceedings of Machine Learning Research, 148:100-110. Available from https://proceedings.mlr.press/v148/palm21a.html.