What Makes a Good Feedforward Computational Graph?

Alex Vitvitskyi, João Guilherme Madeira Araújo, Marc Lackenby, Petar Veličković
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:61635-61651, 2025.

Abstract

As implied by the plethora of literature on graph rewiring, the choice of computational graph employed by a neural network can make a significant impact on its downstream performance. Certain effects related to the computational graph, such as under-reaching and over-squashing, may even render the model incapable of learning certain functions. Most of these effects have only been thoroughly studied in the domain of undirected graphs; however, recent years have seen a significant rise in interest in feedforward computational graphs: directed graphs without any back edges. In this paper, we study the desirable properties of a feedforward computational graph, discovering two important complementary measures: fidelity and mixing time, and evaluating a few popular choices of graphs through the lens of these measures. Our study is backed by both theoretical analyses of the metrics’ asymptotic behaviour for various graphs, as well as correlating these metrics to the performance of trained neural network models using the corresponding graphs.

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-vitvitskyi25a,
  title     = {What Makes a Good Feedforward Computational Graph?},
  author    = {Vitvitskyi, Alex and Ara\'{u}jo, Jo\~{a}o Guilherme Madeira and Lackenby, Marc and Veli\v{c}kovi\'{c}, Petar},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {61635--61651},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/vitvitskyi25a/vitvitskyi25a.pdf},
  url       = {https://proceedings.mlr.press/v267/vitvitskyi25a.html},
  abstract  = {As implied by the plethora of literature on graph rewiring, the choice of computational graph employed by a neural network can make a significant impact on its downstream performance. Certain effects related to the computational graph, such as under-reaching and over-squashing, may even render the model incapable of learning certain functions. Most of these effects have only been thoroughly studied in the domain of undirected graphs; however, recent years have seen a significant rise in interest in feedforward computational graphs: directed graphs without any back edges. In this paper, we study the desirable properties of a feedforward computational graph, discovering two important complementary measures: fidelity and mixing time, and evaluating a few popular choices of graphs through the lens of these measures. Our study is backed by both theoretical analyses of the metrics' asymptotic behaviour for various graphs, as well as correlating these metrics to the performance of trained neural network models using the corresponding graphs.}
}
Endnote
%0 Conference Paper
%T What Makes a Good Feedforward Computational Graph?
%A Alex Vitvitskyi
%A João Guilherme Madeira Araújo
%A Marc Lackenby
%A Petar Veličković
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-vitvitskyi25a
%I PMLR
%P 61635--61651
%U https://proceedings.mlr.press/v267/vitvitskyi25a.html
%V 267
%X As implied by the plethora of literature on graph rewiring, the choice of computational graph employed by a neural network can make a significant impact on its downstream performance. Certain effects related to the computational graph, such as under-reaching and over-squashing, may even render the model incapable of learning certain functions. Most of these effects have only been thoroughly studied in the domain of undirected graphs; however, recent years have seen a significant rise in interest in feedforward computational graphs: directed graphs without any back edges. In this paper, we study the desirable properties of a feedforward computational graph, discovering two important complementary measures: fidelity and mixing time, and evaluating a few popular choices of graphs through the lens of these measures. Our study is backed by both theoretical analyses of the metrics' asymptotic behaviour for various graphs, as well as correlating these metrics to the performance of trained neural network models using the corresponding graphs.
APA
Vitvitskyi, A., Araújo, J.G.M., Lackenby, M. & Veličković, P. (2025). What Makes a Good Feedforward Computational Graph? Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:61635-61651. Available from https://proceedings.mlr.press/v267/vitvitskyi25a.html.