GwAC: GNNs With Asynchronous Communication
Proceedings of the Second Learning on Graphs Conference, PMLR 231:8:1-8:20, 2024.
Abstract
This paper studies the relationship between Graph Neural Networks and distributed computing models to propose a new framework for learning on graphs. Current Graph Neural Networks (GNNs) are closely related to the synchronous model from distributed computing: nodes operate in rounds and receive aggregated neighborhood information all at the same time. Our new framework, in contrast, proposes GNNs with Asynchronous Communication (GwAC): every message is received individually and at potentially different times. We prove that this framework is at least as expressive as the existing synchronous framework. We further analyze GwAC theoretically and practically with regard to several GNN problems: expressiveness beyond 1-Weisfeiler-Lehman (1WL), underreaching, and oversmoothing. GwAC shows promising improvements on all of these problems. We finish with a practical study on how to implement GwAC GNNs efficiently.
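To make the contrast between the two communication models concrete, here is a minimal illustrative sketch (not the paper's implementation): the toy graph, scalar node states, and the averaging update rule are assumptions chosen only to show how a synchronous round aggregates all neighbor messages at once, whereas the asynchronous variant delivers and processes messages one at a time in arbitrary order.

```python
import random

graph = {0: [1, 2], 1: [0, 2], 2: [0, 1]}   # toy adjacency list (assumed)
state = {v: float(v) for v in graph}        # toy scalar node features (assumed)


def synchronous_round(state):
    """Synchronous model: every node receives all neighbor messages,
    aggregated, at the same time, then updates once per round."""
    new_state = {}
    for v, neighbors in graph.items():
        aggregated = sum(state[u] for u in neighbors)      # aggregate step
        new_state[v] = 0.5 * state[v] + 0.5 * aggregated   # update step
    return new_state


def asynchronous_pass(state, num_messages=10, seed=0):
    """Asynchronous communication: each message is delivered individually,
    at a potentially different time, and the receiver updates immediately."""
    rng = random.Random(seed)
    state = dict(state)
    for _ in range(num_messages):
        u = rng.choice(list(graph))                  # arbitrary sender
        v = rng.choice(graph[u])                     # one neighbor receives
        state[v] = 0.5 * state[v] + 0.5 * state[u]   # per-message update
    return state


print("synchronous:", synchronous_round(state))
print("asynchronous:", asynchronous_pass(state))
```

In the asynchronous sketch, the final node states depend on the (here randomized) delivery order, which is exactly the degree of freedom the GwAC framework exposes compared with fixed synchronous rounds.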