Parallel Algorithms Align With Neural Execution

Valerie Engelmayer, Dobrik Georgiev Georgiev, Petar Veličković
Proceedings of the Second Learning on Graphs Conference, PMLR 231:31:1-31:13, 2024.

Abstract

Neural algorithmic reasoners are parallel processors. Teaching them sequential algorithms contradicts this nature, rendering a significant share of their computations redundant. Parallel algorithms, however, may exploit their full computational power, therefore requiring fewer layers to be executed. This drastically reduces training times, as we observe when comparing parallel implementations of searching, sorting and finding strongly connected components to their sequential counterparts on the CLRS framework. Additionally, parallel versions achieve strongly superior predictive performance in most cases.
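The depth argument behind "fewer layers" can be illustrated with a toy reduction (an illustrative sketch only, not the paper's implementation): a sequential scan over n elements forms a chain of n - 1 dependent steps, while a parallel tree reduction resolves the same computation in only ceil(log2 n) rounds, each of which could occupy one processor layer.

```python
def sequential_min_steps(values):
    """Sequential scan: one comparison depends on the previous -> n - 1 steps."""
    best = values[0]
    steps = 0
    for v in values[1:]:
        best = min(best, v)
        steps += 1
    return best, steps

def parallel_min_rounds(values):
    """Tree reduction: all pairs at a level are independent, so each level
    counts as a single parallel round -> ceil(log2 n) rounds."""
    level = list(values)
    rounds = 0
    while len(level) > 1:
        # Every pairwise min at this level could run simultaneously.
        level = [min(level[i], level[i + 1]) if i + 1 < len(level) else level[i]
                 for i in range(0, len(level), 2)]
        rounds += 1
    return level[0], rounds
```

For 16 elements the sequential scan needs 15 dependent steps, while the tree reduction finishes in 4 rounds; the gap widens with input size, which mirrors why parallel variants need fewer executed layers.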

Cite this Paper


BibTeX
@InProceedings{pmlr-v231-engelmayer24a,
  title     = {Parallel Algorithms Align With Neural Execution},
  author    = {Engelmayer, Valerie and Georgiev, Dobrik Georgiev and Veli{\v c}kovi{\'c}, Petar},
  booktitle = {Proceedings of the Second Learning on Graphs Conference},
  pages     = {31:1--31:13},
  year      = {2024},
  editor    = {Villar, Soledad and Chamberlain, Benjamin},
  volume    = {231},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v231/engelmayer24a/engelmayer24a.pdf},
  url       = {https://proceedings.mlr.press/v231/engelmayer24a.html},
  abstract  = {Neural algorithmic reasoners are parallel processors. Teaching them sequential algorithms contradicts this nature, rendering a significant share of their computations redundant. Parallel algorithms, however, may exploit their full computational power, therefore requiring fewer layers to be executed. This drastically reduces training times, as we observe when comparing parallel implementations of searching, sorting and finding strongly connected components to their sequential counterparts on the CLRS framework. Additionally, parallel versions achieve strongly superior predictive performance in most cases.}
}
Endnote
%0 Conference Paper
%T Parallel Algorithms Align With Neural Execution
%A Valerie Engelmayer
%A Dobrik Georgiev Georgiev
%A Petar Veličković
%B Proceedings of the Second Learning on Graphs Conference
%C Proceedings of Machine Learning Research
%D 2024
%E Soledad Villar
%E Benjamin Chamberlain
%F pmlr-v231-engelmayer24a
%I PMLR
%P 31:1--31:13
%U https://proceedings.mlr.press/v231/engelmayer24a.html
%V 231
%X Neural algorithmic reasoners are parallel processors. Teaching them sequential algorithms contradicts this nature, rendering a significant share of their computations redundant. Parallel algorithms, however, may exploit their full computational power, therefore requiring fewer layers to be executed. This drastically reduces training times, as we observe when comparing parallel implementations of searching, sorting and finding strongly connected components to their sequential counterparts on the CLRS framework. Additionally, parallel versions achieve strongly superior predictive performance in most cases.
APA
Engelmayer, V., Georgiev, D.G. & Veličković, P. (2024). Parallel Algorithms Align With Neural Execution. Proceedings of the Second Learning on Graphs Conference, in Proceedings of Machine Learning Research 231:31:1-31:13. Available from https://proceedings.mlr.press/v231/engelmayer24a.html.