Cayley Graph Propagation

JJ Wilson, Maya Bechler-Speicher, Petar Veličković
Proceedings of the Third Learning on Graphs Conference, PMLR 269:8:1-8:20, 2025.

Abstract

In spite of the plethora of success stories with graph neural networks (GNNs) on modelling graph-structured data, they are notoriously vulnerable to over-squashing, whereby tasks necessitate the mixing of information between distant pairs of nodes. To address this problem, prior work suggests rewiring the graph structure to improve information flow. Alternatively, a significant body of research has dedicated itself to discovering and pre-computing bottleneck-free graph structures to ameliorate over-squashing. One such family of bottleneck-free graphs, well regarded in the mathematical community, is that of expander graphs, with prior work—Expander Graph Propagation (EGP)—proposing the use of a well-known expander graph family—the Cayley graphs of the $\mathrm{SL}(2,\mathbb{Z}_n)$ special linear group—as a computational template for GNNs. However, despite its solid theoretical grounding, the computational graphs used by EGP are truncated to align with a given input graph. In this work, we show that such truncation is detrimental to the coveted expansion properties, and instead propose a method that propagates information over the complete Cayley graph structure, thereby ensuring it is bottleneck-free and better alleviating over-squashing. Our empirical evidence across several real-world datasets shows that our method not only recovers significant improvements over EGP, but also matches or outperforms computationally complex graph rewiring techniques.
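To make the computational template concrete, the following is a minimal sketch (not the authors' implementation) of constructing the Cayley graph of $\mathrm{SL}(2,\mathbb{Z}_n)$ by breadth-first closure from the identity, assuming the standard generating set of $\begin{psmallmatrix}1&1\\0&1\end{psmallmatrix}$, $\begin{psmallmatrix}1&0\\1&1\end{psmallmatrix}$ and their inverses, which yields a 4-regular graph:

```python
from collections import deque

def cayley_graph_sl2(n):
    """Build the Cayley graph of SL(2, Z_n) as an edge list.

    Matrices [[a, b], [c, d]] are stored as flat tuples (a, b, c, d).
    Sketch only; generator choice is an assumption, not taken from the paper.
    """
    def matmul(x, y):
        # 2x2 matrix product, entries reduced mod n
        return ((x[0] * y[0] + x[1] * y[2]) % n, (x[0] * y[1] + x[1] * y[3]) % n,
                (x[2] * y[0] + x[3] * y[2]) % n, (x[2] * y[1] + x[3] * y[3]) % n)

    # Generators [[1,1],[0,1]], [[1,0],[1,1]] and their inverses (mod n)
    gens = [(1, 1, 0, 1), (1, 0, 1, 1),
            (1, n - 1, 0, 1), (1, 0, n - 1, 1)]
    identity = (1, 0, 0, 1)
    index = {identity: 0}   # matrix -> node id
    edges = []              # directed edge list (node id pairs)
    queue = deque([identity])
    while queue:
        v = queue.popleft()
        for g in gens:
            w = matmul(v, g)
            if w not in index:          # first visit: assign a node id
                index[w] = len(index)
                queue.append(w)
            edges.append((index[v], index[w]))
    return index, edges
```

For prime $n=p$ the graph has $|\mathrm{SL}(2,\mathbb{Z}_p)| = p(p^2-1)$ nodes, e.g. 24 for $n=3$. The paper's point is that this complete structure should be used as the propagation graph, rather than truncating it to the input graph's node count as EGP does.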

Cite this Paper


BibTeX
@InProceedings{pmlr-v269-wilson25a,
  title     = {Cayley Graph Propagation},
  author    = {Wilson, JJ and Bechler-Speicher, Maya and Veli{\v c}kovi{\' c}, Petar},
  booktitle = {Proceedings of the Third Learning on Graphs Conference},
  pages     = {8:1--8:20},
  year      = {2025},
  editor    = {Wolf, Guy and Krishnaswamy, Smita},
  volume    = {269},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--29 Nov},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v269/main/assets/wilson25a/wilson25a.pdf},
  url       = {https://proceedings.mlr.press/v269/wilson25a.html},
  abstract  = {In spite of the plethora of success stories with graph neural networks (GNNs) on modelling graph-structured data, they are notoriously vulnerable to over-squashing, whereby tasks necessitate the mixing of information between distant pairs of nodes. To address this problem, prior work suggests rewiring the graph structure to improve information flow. Alternatively, a significant body of research has dedicated itself to discovering and pre-computing bottleneck-free graph structures to ameliorate over-squashing. One such family of bottleneck-free graphs, well regarded in the mathematical community, is that of \emph{expander graphs}, with prior work—Expander Graph Propagation (EGP)—proposing the use of a well-known expander graph family—the Cayley graphs of the $\mathrm{SL}(2,\mathbb{Z}_n)$ special linear group—as a computational template for GNNs. However, despite its solid theoretical grounding, the computational graphs used by EGP are truncated to align with a given input graph. In this work, we show that such truncation is detrimental to the coveted expansion properties, and instead propose a method that propagates information over the complete Cayley graph structure, thereby ensuring it is bottleneck-free and better alleviating over-squashing. Our empirical evidence across several real-world datasets shows that our method not only recovers significant improvements over EGP, but also matches or outperforms computationally complex graph rewiring techniques.}
}
Endnote
%0 Conference Paper %T Cayley Graph Propagation %A JJ Wilson %A Maya Bechler-Speicher %A Petar Veličković %B Proceedings of the Third Learning on Graphs Conference %C Proceedings of Machine Learning Research %D 2025 %E Guy Wolf %E Smita Krishnaswamy %F pmlr-v269-wilson25a %I PMLR %P 8:1--8:20 %U https://proceedings.mlr.press/v269/wilson25a.html %V 269 %X In spite of the plethora of success stories with graph neural networks (GNNs) on modelling graph-structured data, they are notoriously vulnerable to over-squashing, whereby tasks necessitate the mixing of information between distant pairs of nodes. To address this problem, prior work suggests rewiring the graph structure to improve information flow. Alternatively, a significant body of research has dedicated itself to discovering and pre-computing bottleneck-free graph structures to ameliorate over-squashing. One such family of bottleneck-free graphs, well regarded in the mathematical community, is that of expander graphs, with prior work—Expander Graph Propagation (EGP)—proposing the use of a well-known expander graph family—the Cayley graphs of the $\mathrm{SL}(2,\mathbb{Z}_n)$ special linear group—as a computational template for GNNs. However, despite its solid theoretical grounding, the computational graphs used by EGP are truncated to align with a given input graph. In this work, we show that such truncation is detrimental to the coveted expansion properties, and instead propose a method that propagates information over the complete Cayley graph structure, thereby ensuring it is bottleneck-free and better alleviating over-squashing. Our empirical evidence across several real-world datasets shows that our method not only recovers significant improvements over EGP, but also matches or outperforms computationally complex graph rewiring techniques.
APA
Wilson, J., Bechler-Speicher, M., & Veličković, P. (2025). Cayley Graph Propagation. Proceedings of the Third Learning on Graphs Conference, in Proceedings of Machine Learning Research 269:8:1-8:20. Available from https://proceedings.mlr.press/v269/wilson25a.html.
