SLOG: An Inductive Spectral Graph Neural Network Beyond Polynomial Filter

Haobo Xu, Yuchen Yan, Dingsu Wang, Zhe Xu, Zhichen Zeng, Tarek F. Abdelzaher, Jiawei Han, Hanghang Tong
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:55348-55370, 2024.

Abstract

Graph neural networks (GNNs) have exhibited superb power in many graph-related tasks. Existing GNNs can be categorized into spatial GNNs and spectral GNNs. Spatial GNNs primarily capture the local information around each node, while spectral GNNs are able to operate on the frequency signals of the entire graph. However, most, if not all, existing spectral GNNs face two limitations: (1) the polynomial limitation, i.e., for most spectral GNNs, the expressive power in the spectral domain is limited to polynomial filters; and (2) the transductive limitation, i.e., most spectral GNNs can only be applied in the transductive setting on relatively small-scale graphs. In this paper, we propose a novel spectral graph neural network named SLOG to address these two limitations. For the polynomial limitation, SLOG proposes a novel real-valued filter with geometric interpretability, mathematical feasibility and adaptive filtering ability that goes beyond polynomial filters. For the transductive limitation, SLOG combines the subgraph sampling technique from spatial GNNs and the signal processing technique from spectral GNNs to make itself tailored to the inductive setting on large-scale graphs. Extensive experimental results on 16 datasets demonstrate the superiority of SLOG on inductive homophilic and heterophilic node classification tasks.
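The "polynomial limitation" mentioned above refers to filters of the form g(L) = Σ_k θ_k L^k, where L is the (normalized) graph Laplacian and the θ_k are learnable coefficients; such a filter can only realize polynomial frequency responses over the Laplacian's eigenvalues. The following is a minimal NumPy sketch of this standard construction, not code from the paper: the toy path graph, signal, and coefficients are made up for illustration. It also verifies that the spatial computation (matrix powers) matches the spectral-domain view (a polynomial applied to the eigenvalues).

```python
import numpy as np

# Toy 4-node path graph: adjacency matrix and symmetric normalized Laplacian.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt  # L = I - D^{-1/2} A D^{-1/2}

def polynomial_filter(L, x, theta):
    """Apply g(L) x = sum_k theta[k] * L^k x without an eigendecomposition."""
    out = np.zeros_like(x)
    Lkx = x.copy()           # starts as L^0 x = x
    for t in theta:
        out += t * Lkx
        Lkx = L @ Lkx        # advance to the next power of L
    return out

x = np.array([1.0, 0.0, 0.0, 0.0])   # a toy node signal
theta = [0.5, -0.3, 0.1]             # coefficients (learnable in a real GNN)
y = polynomial_filter(L, x, theta)

# Spectral-domain view: the same filter is g(lambda) = sum_k theta[k] * lambda^k
# applied elementwise to the Laplacian's eigenvalues.
lam, U = np.linalg.eigh(L)
g = sum(t * lam**k for k, t in enumerate(theta))
y_spec = U @ (g * (U.T @ x))
```

Because g is a fixed-degree polynomial of the eigenvalues, the filter's frequency response is constrained to polynomial shapes; this is exactly the expressivity ceiling the paper's real-valued filter is designed to move beyond.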

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-xu24aa,
  title     = {{SLOG}: An Inductive Spectral Graph Neural Network Beyond Polynomial Filter},
  author    = {Xu, Haobo and Yan, Yuchen and Wang, Dingsu and Xu, Zhe and Zeng, Zhichen and Abdelzaher, Tarek F. and Han, Jiawei and Tong, Hanghang},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {55348--55370},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/xu24aa/xu24aa.pdf},
  url       = {https://proceedings.mlr.press/v235/xu24aa.html},
  abstract  = {Graph neural networks (GNNs) have exhibited superb power in many graph related tasks. Existing GNNs can be categorized into spatial GNNs and spectral GNNs. The spatial GNNs primarily capture the local information around each node, while the spectral GNNs are able to operate on the frequency signals of the entire graph. However, most, if not all, existing spectral GNNs are faced with two limitations: (1) the polynomial limitation that for most spectral GNNs, the expressive power in the spectral domain is limited to polynomial filters; and (2) the transductive limitation that most spectral GNNs can only be applied to the transductive setting on relatively small-scale graphs. In this paper, we propose a novel spectral graph neural network named SLOG to solve the above two limitations. For the polynomial limitation, SLOG proposes a novel real-valued filter with geometric interpretability, mathematical feasibility and adaptive filtering ability to go beyond polynomial. For the transductive limitation, SLOG combines the subgraph sampling technique in spatial GNNs and the signal processing technique in spectral GNNs together to make itself tailored to the inductive setting on large-scale graphs. Extensive experimental results on 16 datasets demonstrate the superiority of SLOG in inductive homophilic and heterophilic node classification task.}
}
Endnote
%0 Conference Paper
%T SLOG: An Inductive Spectral Graph Neural Network Beyond Polynomial Filter
%A Haobo Xu
%A Yuchen Yan
%A Dingsu Wang
%A Zhe Xu
%A Zhichen Zeng
%A Tarek F. Abdelzaher
%A Jiawei Han
%A Hanghang Tong
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-xu24aa
%I PMLR
%P 55348--55370
%U https://proceedings.mlr.press/v235/xu24aa.html
%V 235
%X Graph neural networks (GNNs) have exhibited superb power in many graph related tasks. Existing GNNs can be categorized into spatial GNNs and spectral GNNs. The spatial GNNs primarily capture the local information around each node, while the spectral GNNs are able to operate on the frequency signals of the entire graph. However, most, if not all, existing spectral GNNs are faced with two limitations: (1) the polynomial limitation that for most spectral GNNs, the expressive power in the spectral domain is limited to polynomial filters; and (2) the transductive limitation that most spectral GNNs can only be applied to the transductive setting on relatively small-scale graphs. In this paper, we propose a novel spectral graph neural network named SLOG to solve the above two limitations. For the polynomial limitation, SLOG proposes a novel real-valued filter with geometric interpretability, mathematical feasibility and adaptive filtering ability to go beyond polynomial. For the transductive limitation, SLOG combines the subgraph sampling technique in spatial GNNs and the signal processing technique in spectral GNNs together to make itself tailored to the inductive setting on large-scale graphs. Extensive experimental results on 16 datasets demonstrate the superiority of SLOG in inductive homophilic and heterophilic node classification task.
APA
Xu, H., Yan, Y., Wang, D., Xu, Z., Zeng, Z., Abdelzaher, T.F., Han, J. & Tong, H. (2024). SLOG: An Inductive Spectral Graph Neural Network Beyond Polynomial Filter. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:55348-55370. Available from https://proceedings.mlr.press/v235/xu24aa.html.
