Self-Attentive Associative Memory

Hung Le, Truyen Tran, Svetha Venkatesh
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:5682-5691, 2020.

Abstract

Heretofore, neural networks with external memory have been restricted to a single memory with lossy representations of memory interactions. A rich representation of the relationships between memory pieces calls for a high-order and segregated relational memory. In this paper, we propose to separate the storage of individual experiences (item memory) from the storage of the relationships among them (relational memory). The idea is implemented through a novel Self-attentive Associative Memory (SAM) operator. Founded upon the outer product, SAM forms a set of associative memories that represent hypothetical high-order relationships between arbitrary pairs of memory elements, through which a relational memory is constructed from an item memory. The two memories are wired into a single sequential model capable of both memorization and relational reasoning. Our proposed two-memory model achieves competitive results on a diverse range of machine learning tasks, from challenging synthetic problems to practical testbeds such as geometry, graphs, reinforcement learning, and question answering.
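The operator's core ingredient, as the abstract describes it, is outer-product binding: attention selects items from the item memory, and outer products of the retrieved pairs record their associations in a relational store. The PyTorch sketch below illustrates that idea only; SAMSketch, its two query banks q1 and q2, and the head count are illustrative assumptions, not the paper's parameterization, which wires the operator into a full sequential model.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SAMSketch(nn.Module):
    """Toy outer-product binding in the spirit of the SAM operator.

    Each head attends over the item memory twice, then binds the two
    read-outs with an outer product, so every head holds one (d x d)
    associative matrix over a retrieved pair of items. This is one
    reading of the abstract, not the paper's exact architecture.
    """

    def __init__(self, d: int, n_heads: int = 4):
        super().__init__()
        self.q1 = nn.Parameter(torch.randn(n_heads, d))  # query slots, side 1 (assumed)
        self.q2 = nn.Parameter(torch.randn(n_heads, d))  # query slots, side 2 (assumed)
        self.key = nn.Linear(d, d, bias=False)
        self.value = nn.Linear(d, d, bias=False)
        self.norm = nn.LayerNorm(d)

    def forward(self, M: torch.Tensor) -> torch.Tensor:
        """M: (n, d) item memory -> (n_heads, d, d) relational tensor."""
        Mn = self.norm(M)
        k, v = self.key(Mn), self.value(Mn)            # (n, d) each
        scale = k.shape[-1] ** 0.5
        a1 = F.softmax(self.q1 @ k.T / scale, dim=-1)  # (n_heads, n)
        a2 = F.softmax(self.q2 @ k.T / scale, dim=-1)
        r1, r2 = a1 @ v, a2 @ v                        # attended read-outs, (n_heads, d)
        # Outer product binds each retrieved pair into one associative slice per head.
        return torch.einsum('hi,hj->hij', r1, r2)      # (n_heads, d, d)

# Usage: store ten items of width 16, derive their relational tensor.
M = torch.randn(10, 16)
rel = SAMSketch(d=16)(M)
print(rel.shape)  # torch.Size([4, 16, 16])

Each head's (d x d) slice plays the role of one associative memory; the paper stacks such slices into the higher-order relational memory that, together with the item memory, is read and written over time.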

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-le20b,
  title     = {Self-Attentive Associative Memory},
  author    = {Le, Hung and Tran, Truyen and Venkatesh, Svetha},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {5682--5691},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/le20b/le20b.pdf},
  url       = {https://proceedings.mlr.press/v119/le20b.html},
  abstract  = {Heretofore, neural networks with external memory have been restricted to a single memory with lossy representations of memory interactions. A rich representation of the relationships between memory pieces calls for a high-order and segregated relational memory. In this paper, we propose to separate the storage of individual experiences (item memory) from the storage of the relationships among them (relational memory). The idea is implemented through a novel Self-attentive Associative Memory (SAM) operator. Founded upon the outer product, SAM forms a set of associative memories that represent hypothetical high-order relationships between arbitrary pairs of memory elements, through which a relational memory is constructed from an item memory. The two memories are wired into a single sequential model capable of both memorization and relational reasoning. Our proposed two-memory model achieves competitive results on a diverse range of machine learning tasks, from challenging synthetic problems to practical testbeds such as geometry, graphs, reinforcement learning, and question answering.}
}
Endnote
%0 Conference Paper
%T Self-Attentive Associative Memory
%A Hung Le
%A Truyen Tran
%A Svetha Venkatesh
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-le20b
%I PMLR
%P 5682--5691
%U https://proceedings.mlr.press/v119/le20b.html
%V 119
%X Heretofore, neural networks with external memory have been restricted to a single memory with lossy representations of memory interactions. A rich representation of the relationships between memory pieces calls for a high-order and segregated relational memory. In this paper, we propose to separate the storage of individual experiences (item memory) from the storage of the relationships among them (relational memory). The idea is implemented through a novel Self-attentive Associative Memory (SAM) operator. Founded upon the outer product, SAM forms a set of associative memories that represent hypothetical high-order relationships between arbitrary pairs of memory elements, through which a relational memory is constructed from an item memory. The two memories are wired into a single sequential model capable of both memorization and relational reasoning. Our proposed two-memory model achieves competitive results on a diverse range of machine learning tasks, from challenging synthetic problems to practical testbeds such as geometry, graphs, reinforcement learning, and question answering.
APA
Le, H., Tran, T. & Venkatesh, S. (2020). Self-Attentive Associative Memory. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:5682-5691. Available from https://proceedings.mlr.press/v119/le20b.html.
