A New Perspective On the Expressive Equivalence Between Graph Convolution and Attention Models
Proceedings of the 15th Asian Conference on Machine Learning, PMLR 222:1199-1214, 2024.
Abstract
Graph neural networks (GNNs) have demonstrated impressive results across diverse graph tasks, and research on their expressive power has grown significantly in recent years. The well-known Weisfeiler-Lehman (WL) isomorphism test has been widely used to assess a GNN’s ability to distinguish graph structures. However, despite being considered less expressive than other GNNs on graph-level tasks under the WL test, two prominent GNN models, graph convolutional networks (GCN) and graph attention networks (GAT), still perform strongly on node-level classification tasks. In this paper, we present a comprehensive analysis of their expressive power using a novel evaluation metric: the number of linear regions. We demonstrate that by enhancing GCN with a refined graph Ricci curvature, our proposed high-rank graph convolution network (HRGCN) can match or even exceed the predictive performance of attention models; the two models thus exhibit equivalent node-level expressive power. This fresh perspective shifts the evaluation of GNNs’ expressive power toward node-level classification rather than the graph level alone. Experimental results show that the proposed HRGCN outperforms the state-of-the-art on various classification and prediction tasks.
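To make the curvature ingredient concrete, the sketch below computes a standard combinatorial Forman-Ricci curvature on graph edges with networkx. This is only an illustration of one common curvature notion; the paper's refined graph Ricci curvature and the way HRGCN incorporates it are not specified in this abstract, so the formula and the reweighting idea in the comments are assumptions, not the authors' method.

```python
# Minimal sketch: combinatorial Forman-Ricci curvature for unweighted graphs,
# using the simple edge formula 4 - deg(u) - deg(v). This is an assumed
# stand-in for the paper's refined curvature, shown for intuition only.
import networkx as nx

def forman_curvature(G: nx.Graph) -> dict:
    """Return {edge: 4 - deg(u) - deg(v)} for every edge (u, v) in G."""
    return {(u, v): 4 - G.degree(u) - G.degree(v) for u, v in G.edges()}

G = nx.karate_club_graph()
curv = forman_curvature(G)
# Highly negative curvature flags "bottleneck" edges; a hypothetical use is
# to reweight or rewire GCN message passing around such edges.
print(sorted(curv.items(), key=lambda kv: kv[1])[:5])
```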