Dynamical phases of short-term memory mechanisms in RNNs

Bariscan Kurtkaya, Fatih Dinc, Mert Yuksekgonul, Marta Blanco-Pozo, Ege Cirakman, Mark Schnitzer, Yucel Yemez, Hidenori Tanaka, Peng Yuan, Nina Miolane
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:32032-32062, 2025.

Abstract

Short-term memory is essential for cognitive processing, yet our understanding of its neural mechanisms remains incomplete. Neuroscience has long focused on how sequential activity patterns, where neurons fire one after another within large networks, can explain how information is maintained. While recurrent connections have been shown to drive sequential dynamics, a mechanistic understanding of this process is still lacking. In this work, we introduce two distinct mechanisms that can support this form of short-term memory: slow-point manifolds generating direct sequences, or limit cycles providing temporally localized approximations. Using analytical models, we identify fundamental properties that govern the selection of each mechanism. Specifically, on short-term memory tasks (delayed cue-discrimination tasks), we derive theoretical scaling laws for critical learning rates as a function of the delay period length, beyond which no learning is possible. We empirically verify these results by training and evaluating approximately 80,000 recurrent neural networks (RNNs), which are publicly available for further analysis. Overall, our work provides new insights into short-term memory mechanisms and proposes experimentally testable predictions for systems neuroscience.

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-kurtkaya25a,
  title     = {Dynamical phases of short-term memory mechanisms in {RNN}s},
  author    = {Kurtkaya, Bariscan and Dinc, Fatih and Yuksekgonul, Mert and Blanco-Pozo, Marta and Cirakman, Ege and Schnitzer, Mark and Yemez, Yucel and Tanaka, Hidenori and Yuan, Peng and Miolane, Nina},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {32032--32062},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/kurtkaya25a/kurtkaya25a.pdf},
  url       = {https://proceedings.mlr.press/v267/kurtkaya25a.html},
  abstract  = {Short-term memory is essential for cognitive processing, yet our understanding of its neural mechanisms remains incomplete. Neuroscience has long focused on how sequential activity patterns, where neurons fire one after another within large networks, can explain how information is maintained. While recurrent connections have been shown to drive sequential dynamics, a mechanistic understanding of this process is still lacking. In this work, we introduce two distinct mechanisms that can support this form of short-term memory: slow-point manifolds generating direct sequences, or limit cycles providing temporally localized approximations. Using analytical models, we identify fundamental properties that govern the selection of each mechanism. Specifically, on short-term memory tasks (delayed cue-discrimination tasks), we derive theoretical scaling laws for critical learning rates as a function of the delay period length, beyond which no learning is possible. We empirically verify these results by training and evaluating approximately 80,000 recurrent neural networks (RNNs), which are publicly available for further analysis. Overall, our work provides new insights into short-term memory mechanisms and proposes experimentally testable predictions for systems neuroscience.}
}
Endnote
%0 Conference Paper
%T Dynamical phases of short-term memory mechanisms in RNNs
%A Bariscan Kurtkaya
%A Fatih Dinc
%A Mert Yuksekgonul
%A Marta Blanco-Pozo
%A Ege Cirakman
%A Mark Schnitzer
%A Yucel Yemez
%A Hidenori Tanaka
%A Peng Yuan
%A Nina Miolane
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-kurtkaya25a
%I PMLR
%P 32032--32062
%U https://proceedings.mlr.press/v267/kurtkaya25a.html
%V 267
%X Short-term memory is essential for cognitive processing, yet our understanding of its neural mechanisms remains incomplete. Neuroscience has long focused on how sequential activity patterns, where neurons fire one after another within large networks, can explain how information is maintained. While recurrent connections have been shown to drive sequential dynamics, a mechanistic understanding of this process is still lacking. In this work, we introduce two distinct mechanisms that can support this form of short-term memory: slow-point manifolds generating direct sequences, or limit cycles providing temporally localized approximations. Using analytical models, we identify fundamental properties that govern the selection of each mechanism. Specifically, on short-term memory tasks (delayed cue-discrimination tasks), we derive theoretical scaling laws for critical learning rates as a function of the delay period length, beyond which no learning is possible. We empirically verify these results by training and evaluating approximately 80,000 recurrent neural networks (RNNs), which are publicly available for further analysis. Overall, our work provides new insights into short-term memory mechanisms and proposes experimentally testable predictions for systems neuroscience.
APA
Kurtkaya, B., Dinc, F., Yuksekgonul, M., Blanco-Pozo, M., Cirakman, E., Schnitzer, M., Yemez, Y., Tanaka, H., Yuan, P., & Miolane, N. (2025). Dynamical phases of short-term memory mechanisms in RNNs. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:32032-32062. Available from https://proceedings.mlr.press/v267/kurtkaya25a.html.