GOAT: A Global Transformer on Large-scale Graphs

Kezhi Kong, Jiuhai Chen, John Kirchenbauer, Renkun Ni, C. Bayan Bruss, Tom Goldstein
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:17375-17390, 2023.

Abstract

Graph transformers have been competitive on graph classification tasks, but they fail to outperform Graph Neural Networks (GNNs) on node classification, which is a common task performed on large-scale graphs for industrial applications. Meanwhile, existing GNN architectures are limited in their ability to perform equally well on both homophilious and heterophilious graphs as their inductive biases are generally tailored to only one setting. To address these issues, we propose GOAT, a scalable global graph transformer. In GOAT, each node conceptually attends to all the nodes in the graph and homophily/heterophily relationships can be learnt adaptively from the data. We provide theoretical justification for our approximate global self-attention scheme, and show it to be scalable to large-scale graphs. We demonstrate the competitiveness of GOAT on both heterophilious and homophilious graphs with millions of nodes.
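To make the approximate global self-attention concrete, the sketch below shows the core idea in PyTorch: rather than computing full N-by-N attention over all nodes, each node attends to a small codebook of K centroids that summarize the whole graph (GOAT derives such centroids from a K-Means-style reduction over node representations). This is a minimal illustration under that assumption, not the authors' reference implementation; the class and parameter names (CodebookAttention, num_codes) are ours, and a learned codebook stands in for the K-Means step.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CodebookAttention(nn.Module):
    # Approximate global self-attention: every node attends to K codes
    # summarizing all N nodes, so the cost is O(N*K) instead of O(N^2).
    def __init__(self, dim: int, num_codes: int = 256):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Learned codebook standing in for K-Means centroids of node features.
        self.codebook = nn.Parameter(torch.randn(num_codes, dim))
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) features for every node in the graph.
        queries = self.q(x)                     # (N, dim)
        keys = self.k(self.codebook)            # (K, dim)
        values = self.v(self.codebook)          # (K, dim)
        attn = (queries @ keys.T) * self.scale  # (N, K) attention scores
        attn = F.softmax(attn, dim=-1)
        return attn @ values                    # (N, dim) updated features

x = torch.randn(10_000, 64)          # 10k nodes, 64-dim features
out = CodebookAttention(dim=64)(x)   # attention cost scales linearly in N
print(out.shape)                     # torch.Size([10000, 64])

Because the attention matrix is N-by-K rather than N-by-N, memory and compute grow linearly with the number of nodes, which is what lets a "global" transformer run on graphs with millions of nodes.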

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-kong23a,
  title     = {{GOAT}: A Global Transformer on Large-scale Graphs},
  author    = {Kong, Kezhi and Chen, Jiuhai and Kirchenbauer, John and Ni, Renkun and Bruss, C. Bayan and Goldstein, Tom},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {17375--17390},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/kong23a/kong23a.pdf},
  url       = {https://proceedings.mlr.press/v202/kong23a.html}
}
Endnote
%0 Conference Paper
%T GOAT: A Global Transformer on Large-scale Graphs
%A Kezhi Kong
%A Jiuhai Chen
%A John Kirchenbauer
%A Renkun Ni
%A C. Bayan Bruss
%A Tom Goldstein
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-kong23a
%I PMLR
%P 17375--17390
%U https://proceedings.mlr.press/v202/kong23a.html
%V 202
APA
Kong, K., Chen, J., Kirchenbauer, J., Ni, R., Bruss, C.B. & Goldstein, T. (2023). GOAT: A Global Transformer on Large-scale Graphs. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:17375-17390. Available from https://proceedings.mlr.press/v202/kong23a.html.