From Relational Pooling to Subgraph GNNs: A Universal Framework for More Expressive Graph Neural Networks
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:42742-42768, 2023.
Abstract
Relational pooling (RP) is a framework for building more expressive and permutation-invariant graph neural networks. However, there is limited understanding of the exact enhancement in expressivity that RP provides and of its connection with the Weisfeiler-Lehman (WL) hierarchy. Starting from RP, we propose to explicitly assign labels to nodes as additional features to improve the graph-isomorphism-distinguishing power of message passing neural networks. The method is then extended to higher-dimensional WL, leading to a novel $k,l$-WL algorithm, a more general framework than $k$-WL. We further introduce the subgraph concept into our hierarchy and propose a localized $k,l$-WL framework, which incorporates a wide range of existing work, including many subgraph GNNs. Theoretically, we analyze the expressivity of $k,l$-WL with respect to $k$ and $l$ and compare it with the traditional $k$-WL. Complexity reduction methods are also systematically discussed for building powerful and practical $k,l$-GNN instances. We prove theoretically, and verify experimentally, that our method is universally compatible with any base GNN model and capable of improving its expressivity. Our $k,l$-GNNs achieve superior performance on many synthetic and real-world datasets, which verifies the effectiveness of our framework.
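To make the node-labeling idea concrete, the sketch below illustrates one assignment of $l$ distinct one-hot labels appended to node features before an ordinary message-passing step; in the full framework such assignments are pooled over to retain permutation invariance. This is a minimal illustration under assumed names and a simple mean-aggregation layer, not the authors' implementation or exact architecture.

```python
import numpy as np

def add_node_labels(x, labeled_nodes, l):
    """Append an l-dimensional one-hot label block to the feature matrix x.

    x             : (n, d) node feature matrix
    labeled_nodes : indices of the l nodes that receive distinct labels
    """
    n = x.shape[0]
    labels = np.zeros((n, l))
    for slot, v in enumerate(labeled_nodes):
        labels[v, slot] = 1.0
    return np.concatenate([x, labels], axis=1)

def message_passing_layer(adj, x):
    """One plain mean-aggregation message-passing step (illustrative only)."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    return np.tanh(adj @ x / deg + x)

# Toy usage: a 6-cycle, whose nodes are indistinguishable to a plain MPNN
# with constant input features.
n, l = 6, 2
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0

x = np.ones((n, 1))                        # constant input features
x_labeled = add_node_labels(x, [0, 3], l)  # label two nodes to break symmetry
h = message_passing_layer(adj, x_labeled)
print(h.shape)  # (6, 3): original feature plus l label channels
```

With the labels attached, nodes at different distances from the labeled pair receive different representations after message passing, which is the kind of symmetry breaking a plain MPNN cannot achieve on its own.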