Associative Memory in Iterated Overparameterized Sigmoid Autoencoders

Yibo Jiang, Cengiz Pehlevan
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:4828-4838, 2020.

Abstract

Recent work showed that overparameterized autoencoders can be trained to implement associative memory via iterative maps, when the trained input-output Jacobian of the network has all of its eigenvalue norms strictly below one. Here, we theoretically analyze this phenomenon for sigmoid networks by leveraging recent developments in deep learning theory, especially the correspondence between training neural networks in the infinite-width limit and performing kernel regression with the Neural Tangent Kernel (NTK). We find that overparameterized sigmoid autoencoders can have attractors in the NTK limit for both training with a single example and multiple examples under certain conditions. In particular, for multiple training examples, we find that the norm of the largest Jacobian eigenvalue drops below one with increasing input norm, leading to associative memory.
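The mechanism described in the abstract — a trained autoencoder acting as an associative memory when iterated, provided every eigenvalue of its input-output Jacobian has norm below one — can be illustrated with a toy numerical sketch. This is not the paper's NTK analysis; it is a hypothetical one-hidden-layer sigmoid network constructed (rather than trained) so that a stored pattern is an exact fixed point, after which we check the Jacobian's spectral radius and iterate the map from a perturbed input.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical toy autoencoder f(x) = W2 @ sigmoid(W1 @ x), with the
# readout W2 chosen so that a stored pattern x_star is an exact fixed
# point: f(x_star) = x_star.
d, h = 4, 64                        # input dim, hidden width
x_star = rng.normal(size=d)         # the pattern to "store"

W1 = rng.normal(size=(h, d)) / np.sqrt(d)
a = sigmoid(W1 @ x_star)            # hidden activations at x_star
W2 = np.outer(x_star, a) / (a @ a)  # least-norm readout with W2 @ a = x_star

def f(x):
    return W2 @ sigmoid(W1 @ x)

# Input-output Jacobian at the fixed point:
# J = W2 @ diag(sigma'(W1 x_star)) @ W1, where sigma' = s * (1 - s).
s = sigmoid(W1 @ x_star)
J = W2 @ np.diag(s * (1 - s)) @ W1
spectral_radius = np.max(np.abs(np.linalg.eigvals(J)))

# When all eigenvalue norms are below one, x_star attracts nearby
# inputs under iteration -- the associative-memory readout.
x = x_star + 0.1 * rng.normal(size=d)
for _ in range(200):
    x = f(x)

print("spectral radius:", spectral_radius)
print("recovered stored pattern:", np.allclose(x, x_star, atol=1e-6))
```

In this construction the Jacobian is rank-one, so its single nonzero eigenvalue determines stability; the abstract's multi-example result concerns the analogous condition (largest Jacobian eigenvalue norm dropping below one) in the NTK limit, which this sketch does not reproduce.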

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-jiang20e,
  title     = {Associative Memory in Iterated Overparameterized Sigmoid Autoencoders},
  author    = {Jiang, Yibo and Pehlevan, Cengiz},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4828--4838},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/jiang20e/jiang20e.pdf},
  url       = {https://proceedings.mlr.press/v119/jiang20e.html},
  abstract  = {Recent work showed that overparameterized autoencoders can be trained to implement associative memory via iterative maps, when the trained input-output Jacobian of the network has all of its eigenvalue norms strictly below one. Here, we theoretically analyze this phenomenon for sigmoid networks by leveraging recent developments in deep learning theory, especially the correspondence between training neural networks in the infinite-width limit and performing kernel regression with the Neural Tangent Kernel (NTK). We find that overparameterized sigmoid autoencoders can have attractors in the NTK limit for both training with a single example and multiple examples under certain conditions. In particular, for multiple training examples, we find that the norm of the largest Jacobian eigenvalue drops below one with increasing input norm, leading to associative memory.}
}
Endnote
%0 Conference Paper
%T Associative Memory in Iterated Overparameterized Sigmoid Autoencoders
%A Yibo Jiang
%A Cengiz Pehlevan
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-jiang20e
%I PMLR
%P 4828--4838
%U https://proceedings.mlr.press/v119/jiang20e.html
%V 119
%X Recent work showed that overparameterized autoencoders can be trained to implement associative memory via iterative maps, when the trained input-output Jacobian of the network has all of its eigenvalue norms strictly below one. Here, we theoretically analyze this phenomenon for sigmoid networks by leveraging recent developments in deep learning theory, especially the correspondence between training neural networks in the infinite-width limit and performing kernel regression with the Neural Tangent Kernel (NTK). We find that overparameterized sigmoid autoencoders can have attractors in the NTK limit for both training with a single example and multiple examples under certain conditions. In particular, for multiple training examples, we find that the norm of the largest Jacobian eigenvalue drops below one with increasing input norm, leading to associative memory.
APA
Jiang, Y. & Pehlevan, C. (2020). Associative Memory in Iterated Overparameterized Sigmoid Autoencoders. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:4828-4838. Available from https://proceedings.mlr.press/v119/jiang20e.html.

Related Material