Enhancing Topological Dependencies in Spatio-Temporal Graphs With Cycle Message Passing Blocks

Minho Lee, Yun Young Choi, Sun Woo Park, Seunghwan Lee, Joohwan Ko, Jaeyoung Hong
Proceedings of the Third Learning on Graphs Conference, PMLR 269:33:1-33:17, 2025.

Abstract

Graph Neural Networks (GNNs) and Transformer-based models have been increasingly adopted to learn complex vector representations of spatio-temporal graphs, capturing the intricate spatio-temporal dependencies crucial for applications such as traffic forecasting. Although many existing methods use multi-head attention mechanisms and message-passing neural networks (MPNNs) to capture both spatial and temporal relations, these approaches encode the two independently and reflect the graph's topological characteristics only in a limited manner. In this work, we introduce the Cycle to Mixer (Cy2Mixer), a novel spatio-temporal GNN that combines topologically non-trivial invariants of spatio-temporal graphs with gated multi-layer perceptrons (gMLP). Cy2Mixer is composed of three MLP-based blocks: a temporal block that captures temporal properties, a message-passing block that encapsulates spatial information, and a cycle message-passing block that enriches topological information through cyclic subgraphs. We support the effectiveness of Cy2Mixer with mathematical evidence that the cycle message-passing block offers the model information distinct from that of the ordinary message-passing block. Furthermore, empirical evaluations substantiate the efficacy of Cy2Mixer, which achieves state-of-the-art performance across various spatio-temporal benchmark datasets. The source code is available at https://anonymous.4open.science/r/cy2mixer-D5A9.
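To make the three-block design concrete, below is a minimal PyTorch sketch of one Cy2Mixer-style layer. The gMLP-style gating, the mean-aggregation form of message passing, the use of networkx.cycle_basis to build a cycle adjacency, and the simple averaging of the three branch outputs are all illustrative assumptions, not the paper's reference implementation (see the linked repository for that).

    import networkx as nx
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    def cycle_adjacency(g: nx.Graph) -> torch.Tensor:
        """Connect every pair of nodes that co-occur in a basis cycle.

        nx.cycle_basis returns one (non-unique) basis of independent cycles;
        this is an illustrative stand-in for the paper's cyclic-subgraph
        construction."""
        n = g.number_of_nodes()
        adj = torch.zeros(n, n)
        for cyc in nx.cycle_basis(g):
            for u in cyc:
                for v in cyc:
                    if u != v:
                        adj[u, v] = 1.0
        return adj


    def row_normalize(a: torch.Tensor) -> torch.Tensor:
        """Turn a binary adjacency into a mean-aggregation operator."""
        return a / a.sum(-1, keepdim=True).clamp(min=1.0)


    class GatedBranch(nn.Module):
        """gMLP-style block: channel projection, then gating by token-mixed
        features, where `mix` is a (tokens x tokens) mixing operator."""

        def __init__(self, dim: int, hidden: int):
            super().__init__()
            self.norm = nn.LayerNorm(dim)
            self.proj_in = nn.Linear(dim, 2 * hidden)
            self.proj_out = nn.Linear(hidden, dim)

        def forward(self, x: torch.Tensor, mix: torch.Tensor) -> torch.Tensor:
            # x: (..., tokens, dim), with the mixed axis second-to-last
            res, gate = F.gelu(self.proj_in(self.norm(x))).chunk(2, dim=-1)
            gate = torch.einsum("ij,...jd->...id", mix, gate)  # token mixing
            return x + self.proj_out(res * gate)


    class Cy2MixerLayer(nn.Module):
        """One layer with the paper's three branches: temporal, message-passing
        (spatial), and cycle message-passing. Fusing the branches by averaging
        their residual outputs is an assumption of this sketch."""

        def __init__(self, dim: int, hidden: int, n_steps: int,
                     adj: torch.Tensor, cyc_adj: torch.Tensor):
            super().__init__()
            self.temporal = GatedBranch(dim, hidden)
            self.spatial = GatedBranch(dim, hidden)
            self.cycle = GatedBranch(dim, hidden)
            # Learned temporal mixing, initialized near identity as in gMLP.
            self.t_mix = nn.Parameter(
                torch.eye(n_steps) + 0.01 * torch.randn(n_steps, n_steps))
            self.register_buffer("a_mix", row_normalize(adj))
            self.register_buffer("c_mix", row_normalize(cyc_adj))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, T, N, dim)
            xt = self.temporal(x.transpose(1, 2), self.t_mix).transpose(1, 2)
            xs = self.spatial(x, self.a_mix)
            xc = self.cycle(x, self.c_mix)
            return (xt + xs + xc) / 3.0


    # Hypothetical usage on a toy graph: batch of 4, 12 time steps.
    g = nx.karate_club_graph()
    adj = torch.tensor(nx.to_numpy_array(g), dtype=torch.float32)
    layer = Cy2MixerLayer(dim=16, hidden=32, n_steps=12,
                          adj=adj, cyc_adj=cycle_adjacency(g))
    out = layer(torch.randn(4, 12, g.number_of_nodes(), 16))  # (4, 12, 34, 16)

Each branch mixes along exactly one axis (time steps, graph edges, or shared cycles), which is what lets the cycle branch inject topological information that edge-level message passing cannot; the abstract's mathematical claim is precisely that these two operators are not interchangeable.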

Cite this Paper


BibTeX

    @InProceedings{pmlr-v269-lee25a,
      title     = {Enhancing Topological Dependencies in Spatio-Temporal Graphs With Cycle Message Passing Blocks},
      author    = {Lee, Minho and Choi, Yun Young and Park, Sun Woo and Lee, Seunghwan and Ko, Joohwan and Hong, Jaeyoung},
      booktitle = {Proceedings of the Third Learning on Graphs Conference},
      pages     = {33:1--33:17},
      year      = {2025},
      editor    = {Wolf, Guy and Krishnaswamy, Smita},
      volume    = {269},
      series    = {Proceedings of Machine Learning Research},
      month     = {26--29 Nov},
      publisher = {PMLR},
      pdf       = {https://raw.githubusercontent.com/mlresearch/v269/main/assets/lee25a/lee25a.pdf},
      url       = {https://proceedings.mlr.press/v269/lee25a.html}
    }

APA
Lee, M., Choi, Y.Y., Park, S.W., Lee, S., Ko, J. & Hong, J. (2025). Enhancing Topological Dependencies in Spatio-Temporal Graphs With Cycle Message Passing Blocks. Proceedings of the Third Learning on Graphs Conference, in Proceedings of Machine Learning Research 269:33:1-33:17. Available from https://proceedings.mlr.press/v269/lee25a.html.
