A New Perspective On the Expressive Equivalence Between Graph Convolution and Attention Models

Dai Shi, Zhiqi Shao, Andi Han, Yi Guo, Junbin Gao
Proceedings of the 15th Asian Conference on Machine Learning, PMLR 222:1199-1214, 2024.

Abstract

Graph neural networks (GNNs) have demonstrated impressive achievements in diverse graph tasks, and research on their expressive power has experienced significant growth in recent years. The well-known Weisfeiler and Lehman (WL) isomorphism test has been widely used to assess GNNs’ ability to distinguish graph structures. However, despite being considered less expressive than other GNNs in graph-level tasks based on the WL test, two prominent GNN models, namely graph convolution networks (GCN) and attention-based graph networks (GAT), still exhibit strong performance in node-level classification tasks. In this paper, we present a comprehensive analysis of their expressive power using a novel evaluation metric: the number of linear regions. We demonstrate that by enhancing GCN with refined graph Ricci curvature, our proposed high-rank graph convolution network (HRGCN) can match or even surpass the prediction advantage of attention models. Thus, the two models exhibit equivalent node-level expressive powers. This fresh perspective highlights the evaluation of GNNs’ expressive power in node-level classifications rather than solely at the graph level. Experimental results showcase that the proposed HRGCN model outperforms the state-of-the-art in various classification and prediction tasks.

Cite this Paper


BibTeX
@InProceedings{pmlr-v222-shi24a,
  title     = {A New Perspective On the Expressive Equivalence Between Graph Convolution and Attention Models},
  author    = {Shi, Dai and Shao, Zhiqi and Han, Andi and Guo, Yi and Gao, Junbin},
  booktitle = {Proceedings of the 15th Asian Conference on Machine Learning},
  pages     = {1199--1214},
  year      = {2024},
  editor    = {Yanıkoğlu, Berrin and Buntine, Wray},
  volume    = {222},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v222/shi24a/shi24a.pdf},
  url       = {https://proceedings.mlr.press/v222/shi24a.html},
  abstract  = {Graph neural networks (GNNs) have demonstrated impressive achievements in diverse graph tasks, and research on their expressive power has experienced significant growth in recent years. The well-known Weisfeiler and Lehman (WL) isomorphism test has been widely used to assess GNNs’ ability to distinguish graph structures. However, despite being considered less expressive than other GNNs in graph-level tasks based on the WL test, two prominent GNN models, namely graph convolution networks (GCN) and attention-based graph networks (GAT), still exhibit strong performance in node-level classification tasks. In this paper, we present a comprehensive analysis of their expressive power using a novel evaluation metric: the number of linear regions. We demonstrate that by enhancing GCN with refined graph Ricci curvature, our proposed high-rank graph convolution network (HRGCN) can match or even surpass the prediction advantage of attention models. Thus, the two models exhibit equivalent node-level expressive powers. This fresh perspective highlights the evaluation of GNNs’ expressive power in node-level classifications rather than solely at the graph level. Experimental results showcase that the proposed HRGCN model outperforms the state-of-the-art in various classification and prediction tasks.}
}
Endnote
%0 Conference Paper
%T A New Perspective On the Expressive Equivalence Between Graph Convolution and Attention Models
%A Dai Shi
%A Zhiqi Shao
%A Andi Han
%A Yi Guo
%A Junbin Gao
%B Proceedings of the 15th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Berrin Yanıkoğlu
%E Wray Buntine
%F pmlr-v222-shi24a
%I PMLR
%P 1199--1214
%U https://proceedings.mlr.press/v222/shi24a.html
%V 222
%X Graph neural networks (GNNs) have demonstrated impressive achievements in diverse graph tasks, and research on their expressive power has experienced significant growth in recent years. The well-known Weisfeiler and Lehman (WL) isomorphism test has been widely used to assess GNNs’ ability to distinguish graph structures. However, despite being considered less expressive than other GNNs in graph-level tasks based on the WL test, two prominent GNN models, namely graph convolution networks (GCN) and attention-based graph networks (GAT), still exhibit strong performance in node-level classification tasks. In this paper, we present a comprehensive analysis of their expressive power using a novel evaluation metric: the number of linear regions. We demonstrate that by enhancing GCN with refined graph Ricci curvature, our proposed high-rank graph convolution network (HRGCN) can match or even surpass the prediction advantage of attention models. Thus, the two models exhibit equivalent node-level expressive powers. This fresh perspective highlights the evaluation of GNNs’ expressive power in node-level classifications rather than solely at the graph level. Experimental results showcase that the proposed HRGCN model outperforms the state-of-the-art in various classification and prediction tasks.
APA
Shi, D., Shao, Z., Han, A., Guo, Y. &amp; Gao, J. (2024). A New Perspective On the Expressive Equivalence Between Graph Convolution and Attention Models. Proceedings of the 15th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 222:1199-1214. Available from https://proceedings.mlr.press/v222/shi24a.html.