Universal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models

Beren Millidge, Tommaso Salvatori, Yuhang Song, Thomas Lukasiewicz, Rafal Bogacz
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:15561-15583, 2022.

Abstract

A large number of neural network models of associative memory have been proposed in the literature. These include the classical Hopfield networks (HNs), sparse distributed memories (SDMs), and more recently the modern continuous Hopfield networks (MCHNs), which possess close links with self-attention in machine learning. In this paper, we propose a general framework for understanding the operation of such memory networks as a sequence of three operations: similarity, separation, and projection. We derive all these memory models as instances of our general framework with differing similarity and separation functions. We extend the mathematical framework of Krotov et al. (2020) to express general associative memory models using neural network dynamics with local computation, and derive a general energy function that is a Lyapunov function of the dynamics. Finally, using our framework, we empirically investigate the effect of using different similarity functions in these associative memory models, beyond the standard dot-product similarity measure, and demonstrate that Euclidean or Manhattan distance similarity metrics perform substantially better in practice on many tasks, enabling more robust retrieval and higher memory capacity than existing models.
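The retrieval pipeline described in the abstract (similarity, then separation, then projection) can be sketched in a few lines. The following is a minimal NumPy illustration under stated assumptions, not the paper's reference code: the function name uhn_retrieve, the inverse-temperature parameter beta, and the choice of softmax as the separation function are assumptions made for the example. With dot-product similarity and softmax separation the sketch reduces to the MCHN / self-attention update; swapping in negative Euclidean or Manhattan distance gives the alternative similarity functions the paper evaluates.

import numpy as np

def uhn_retrieve(memories, query, similarity="dot", beta=8.0):
    """Single-shot associative retrieval: score the query against each
    stored pattern (similarity), sharpen the scores (separation), and
    map them back into pattern space (projection).

    memories: (N, d) array of stored patterns, one per row.
    query:    (d,) possibly corrupted query pattern.
    """
    if similarity == "dot":            # classical HN / MCHN-style similarity
        scores = memories @ query
    elif similarity == "euclidean":    # negative L2 distance as similarity
        scores = -np.linalg.norm(memories - query, axis=1)
    elif similarity == "manhattan":    # negative L1 distance as similarity
        scores = -np.abs(memories - query).sum(axis=1)
    else:
        raise ValueError(f"unknown similarity: {similarity}")

    # Separation: a numerically stable softmax sharpens the score
    # distribution so the best-matching memory dominates.
    weights = np.exp(beta * (scores - scores.max()))
    weights /= weights.sum()

    # Projection: a weighted combination of the stored patterns.
    return weights @ memories

# Example: store 10 random bipolar patterns, corrupt a quarter of one,
# and check that Manhattan-similarity retrieval recovers the original.
M = np.sign(np.random.randn(10, 64))
q = M[0].copy()
q[:16] *= -1
print(np.allclose(np.sign(uhn_retrieve(M, q, similarity="manhattan")), M[0]))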

Cite this Paper

BibTeX
@InProceedings{pmlr-v162-millidge22a,
  title     = {Universal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models},
  author    = {Millidge, Beren and Salvatori, Tommaso and Song, Yuhang and Lukasiewicz, Thomas and Bogacz, Rafal},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {15561--15583},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/millidge22a/millidge22a.pdf},
  url       = {https://proceedings.mlr.press/v162/millidge22a.html}
}
Endnote
%0 Conference Paper
%T Universal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models
%A Beren Millidge
%A Tommaso Salvatori
%A Yuhang Song
%A Thomas Lukasiewicz
%A Rafal Bogacz
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-millidge22a
%I PMLR
%P 15561--15583
%U https://proceedings.mlr.press/v162/millidge22a.html
%V 162
APA
Millidge, B., Salvatori, T., Song, Y., Lukasiewicz, T., & Bogacz, R. (2022). Universal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:15561-15583. Available from https://proceedings.mlr.press/v162/millidge22a.html.