DRew: Dynamically Rewired Message Passing with Delay

Benjamin Gutteridge, Xiaowen Dong, Michael M. Bronstein, Francesco Di Giovanni
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:12252-12267, 2023.

Abstract

Message passing neural networks (MPNNs) have been shown to suffer from the phenomenon of over-squashing that causes poor performance for tasks relying on long-range interactions. This can be largely attributed to message passing only occurring locally, over a node's immediate neighbours. Rewiring approaches attempting to make graphs 'more connected', and supposedly better suited to long-range tasks, often lose the inductive bias provided by distance on the graph since they make distant nodes communicate instantly at every layer. In this paper we propose a framework, applicable to any MPNN architecture, that performs a layer-dependent rewiring to ensure gradual densification of the graph. We also propose a delay mechanism that permits skip connections between nodes depending on the layer and their mutual distance. We validate our approach on several long-range tasks and show that it outperforms graph Transformers and multi-hop MPNNs.
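The abstract's two mechanisms, gradual densification and distance-dependent delay, are easy to prototype. The sketch below is a minimal NumPy illustration written against the abstract alone, not the authors' implementation: the mean-and-tanh aggregation, the random per-(layer, hop) weights, and the (k - 1) // nu delay schedule are assumptions made for illustration, and the names drew_forward and all_pairs_distances are likewise invented here.

import numpy as np
from collections import deque

def all_pairs_distances(adj):
    # BFS from every node; dist[s, u] = shortest-path distance (-1 if unreachable).
    n = len(adj)
    dist = np.full((n, n), -1, dtype=int)
    for s in range(n):
        dist[s, s] = 0
        q = deque([s])
        while q:
            v = q.popleft()
            for u in adj[v]:
                if dist[s, u] == -1:
                    dist[s, u] = dist[s, v] + 1
                    q.append(u)
    return dist

def drew_forward(adj, x, num_layers, nu=1, rng=None):
    # x: (n, d) node features. At layer t (0-indexed), node v aggregates over
    # k-hop neighbours for every k <= t + 1, so the effective graph gains one
    # hop of connectivity per layer ("gradual densification"). With delay, a
    # k-hop message is computed from the sender's state (k - 1) // nu layers
    # in the past (an assumed schedule), giving distance-dependent skip
    # connections between nodes.
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = x.shape
    dist = all_pairs_distances(adj)
    # One weight matrix per (layer, hop) pair; random here, learned in practice.
    W = {(t, k): rng.normal(scale=d ** -0.5, size=(d, d))
         for t in range(num_layers) for k in range(1, t + 2)}
    history = [x]  # h^(0), h^(1), ...; kept so delayed states can be looked up
    for t in range(num_layers):
        h_new = history[-1].copy()          # self / residual term
        for k in range(1, t + 2):           # hops active at this layer
            delay = (k - 1) // nu           # 0 for 1-hop, grows with distance
            h_src = history[max(0, t - delay)]
            for v in range(n):
                nbrs = np.where(dist[v] == k)[0]
                if len(nbrs):               # mean-aggregate the k-hop messages
                    h_new[v] += np.tanh(h_src[nbrs].mean(axis=0) @ W[(t, k)])
        history.append(h_new)
    return history[-1]

# Toy path graph 0-1-2-3: by the third layer, node 0 receives a (delayed)
# message directly from node 3 through the activated 3-hop edge.
adj = [[1], [0, 2], [1, 3], [2]]
h = drew_forward(adj, x=np.eye(4), num_layers=3)
print(h.shape)  # (4, 4)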

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-gutteridge23a,
  title     = {{DR}ew: Dynamically Rewired Message Passing with Delay},
  author    = {Gutteridge, Benjamin and Dong, Xiaowen and Bronstein, Michael M. and Di Giovanni, Francesco},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {12252--12267},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/gutteridge23a/gutteridge23a.pdf},
  url       = {https://proceedings.mlr.press/v202/gutteridge23a.html}
}
Endnote
%0 Conference Paper
%T DRew: Dynamically Rewired Message Passing with Delay
%A Benjamin Gutteridge
%A Xiaowen Dong
%A Michael M. Bronstein
%A Francesco Di Giovanni
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-gutteridge23a
%I PMLR
%P 12252--12267
%U https://proceedings.mlr.press/v202/gutteridge23a.html
%V 202
APA
Gutteridge, B., Dong, X., Bronstein, M. M., & Di Giovanni, F. (2023). DRew: Dynamically Rewired Message Passing with Delay. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:12252-12267. Available from https://proceedings.mlr.press/v202/gutteridge23a.html.