Cayley Graph Propagation
Proceedings of the Third Learning on Graphs Conference, PMLR 269:8:1-8:20, 2025.
Abstract
In spite of the plethora of success stories with graph neural networks (GNNs) on modelling graph-structured data, they are notoriously vulnerable to over-squashing, whereby tasks necessitate the mixing of information between distant pairs of nodes. To address this problem, prior work suggests rewiring the graph structure to improve information flow. Alternatively, a significant body of research has dedicated itself to discovering and pre-computing bottleneck-free graph structures to ameliorate over-squashing. One such family of bottleneck-free graphs that is well regarded in the mathematical community is that of \emph{expander graphs}, with prior work, Expander Graph Propagation (EGP), proposing the use of a well-known expander graph family, the Cayley graphs of the $\mathrm{SL}(2,\mathbb{Z}_n)$ special linear group, as a computational template for GNNs. However, despite its solid theoretical grounding, EGP truncates its computational graphs to align with a given input graph. In this work, we show that such truncation is detrimental to the coveted expansion properties, and instead propose a method that propagates information over the complete Cayley graph structure, thereby ensuring it is bottleneck-free and better alleviates over-squashing. Our empirical evidence across several real-world datasets shows that our method not only recovers significant improvements over EGP, but also performs akin to, or outperforms, computationally complex graph rewiring techniques.
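To make the computational template concrete, below is a minimal Python sketch of enumerating the Cayley graph of $\mathrm{SL}(2,\mathbb{Z}_n)$ by breadth-first search from the identity, using the two standard unipotent generators and their inverses; the function names and the choice of $n$ are illustrative, not the paper's implementation.

```python
# A minimal sketch (not the authors' code): enumerate the Cayley graph of
# SL(2, Z_n) by breadth-first search from the identity, using the two
# standard generators [[1,1],[0,1]] and [[1,0],[1,1]] plus their inverses.

def matmul_mod(a, b, n):
    """Multiply two 2x2 matrices, stored as 4-tuples (row-major), modulo n."""
    (a0, a1, a2, a3), (b0, b1, b2, b3) = a, b
    return ((a0 * b0 + a1 * b2) % n, (a0 * b1 + a1 * b3) % n,
            (a2 * b0 + a3 * b2) % n, (a2 * b1 + a3 * b3) % n)

def cayley_graph_sl2(n):
    """Return (num_nodes, edge list) of the Cayley graph of SL(2, Z_n)."""
    gens = [(1, 1, 0, 1), (1, 0, 1, 1),          # the two generators...
            (1, n - 1, 0, 1), (1, 0, n - 1, 1)]  # ...and their inverses mod n
    identity = (1, 0, 0, 1)
    index = {identity: 0}      # group element -> node id
    frontier = [identity]
    edges = set()
    while frontier:            # breadth-first expansion of the group
        nxt = []
        for g in frontier:
            for s in gens:
                h = matmul_mod(g, s, n)
                if h not in index:
                    index[h] = len(index)
                    nxt.append(h)
                edges.add((index[g], index[h]))
        frontier = nxt
    return len(index), sorted(edges)

num_nodes, edges = cayley_graph_sl2(3)
print(num_nodes)  # 24: |SL(2, Z_3)| = 24, and the graph is 4-regular
```

For $n=3$ this produces the complete 24-node, 4-regular Cayley graph; in the terms of the abstract, EGP would truncate such a graph to match the input graph's node count, whereas propagating over all $|\mathrm{SL}(2,\mathbb{Z}_n)|$ nodes preserves the expansion properties.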