DNCs Require More Planning Steps

Yara Shamshoum, Nitzan Hodos, Yuval Sieradzki, Assaf Schuster
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:44358-44377, 2024.

Abstract

Many recent works use machine learning models to solve various complex algorithmic problems. However, these models attempt to reach a solution without considering the problem’s required computational complexity, which can be detrimental to their ability to solve it correctly. In this work we investigate the effect of computational time and memory on generalization of implicit algorithmic solvers. To do so, we focus on the Differentiable Neural Computer (DNC), a general problem solver that also lets us reason directly about its usage of time and memory. We argue that the number of planning steps the model is allowed to take, which we call the “planning budget”, is a constraint that can cause the model to generalize poorly and hurt its ability to fully utilize its external memory. We evaluate our method on Graph Shortest Path, Convex Hull, Graph MinCut and Associative Recall, and show how the planning budget can drastically change the behavior of the learned algorithm, in terms of learned time complexity, training time, stability and generalization to inputs larger than those seen during training.

Cite this Paper

BibTeX
@InProceedings{pmlr-v235-shamshoum24a,
  title     = {{DNC}s Require More Planning Steps},
  author    = {Shamshoum, Yara and Hodos, Nitzan and Sieradzki, Yuval and Schuster, Assaf},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {44358--44377},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/shamshoum24a/shamshoum24a.pdf},
  url       = {https://proceedings.mlr.press/v235/shamshoum24a.html},
  abstract  = {Many recent works use machine learning models to solve various complex algorithmic problems. However, these models attempt to reach a solution without considering the problem’s required computational complexity, which can be detrimental to their ability to solve it correctly. In this work we investigate the effect of computational time and memory on generalization of implicit algorithmic solvers. To do so, we focus on the Differentiable Neural Computer (DNC), a general problem solver that also lets us reason directly about its usage of time and memory. We argue that the number of planning steps the model is allowed to take, which we call the “planning budget”, is a constraint that can cause the model to generalize poorly and hurt its ability to fully utilize its external memory. We evaluate our method on Graph Shortest Path, Convex Hull, Graph MinCut and Associative Recall, and show how the planning budget can drastically change the behavior of the learned algorithm, in terms of learned time complexity, training time, stability and generalization to inputs larger than those seen during training.}
}
EndNote
%0 Conference Paper
%T DNCs Require More Planning Steps
%A Yara Shamshoum
%A Nitzan Hodos
%A Yuval Sieradzki
%A Assaf Schuster
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-shamshoum24a
%I PMLR
%P 44358--44377
%U https://proceedings.mlr.press/v235/shamshoum24a.html
%V 235
%X Many recent works use machine learning models to solve various complex algorithmic problems. However, these models attempt to reach a solution without considering the problem’s required computational complexity, which can be detrimental to their ability to solve it correctly. In this work we investigate the effect of computational time and memory on generalization of implicit algorithmic solvers. To do so, we focus on the Differentiable Neural Computer (DNC), a general problem solver that also lets us reason directly about its usage of time and memory. We argue that the number of planning steps the model is allowed to take, which we call the “planning budget”, is a constraint that can cause the model to generalize poorly and hurt its ability to fully utilize its external memory. We evaluate our method on Graph Shortest Path, Convex Hull, Graph MinCut and Associative Recall, and show how the planning budget can drastically change the behavior of the learned algorithm, in terms of learned time complexity, training time, stability and generalization to inputs larger than those seen during training.
APA
Shamshoum, Y., Hodos, N., Sieradzki, Y. &amp; Schuster, A. (2024). DNCs Require More Planning Steps. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:44358-44377. Available from https://proceedings.mlr.press/v235/shamshoum24a.html.