Degree-based stratification of nodes in Graph Neural Networks

Ameen Ali, Lior Wolf, Hakan Cevikalp
Proceedings of the 15th Asian Conference on Machine Learning, PMLR 222:15-27, 2024.

Abstract

Despite much research, Graph Neural Networks (GNNs) still do not display the favorable scaling properties of other deep neural networks such as Convolutional Neural Networks and Transformers. Previous work has identified issues such as oversmoothing of the latent representation and has suggested solutions such as skip connections and sophisticated normalization schemes. Here, we propose a different approach that is based on a stratification of the graph nodes. We provide motivation that the nodes in a graph can be stratified into those with a low degree and those with a high degree and that the two groups are likely to behave differently. Based on this motivation, we modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group. This simple-to-implement modification seems to improve performance across datasets and GNN methods. To verify that this increase in performance is not only due to the added capacity, we also perform the same modification for random splits of the nodes, which does not lead to any improvement.

Cite this Paper


BibTeX
@InProceedings{pmlr-v222-ali24a,
  title     = {Degree-based stratification of nodes in Graph Neural Networks},
  author    = {Ali, Ameen and Wolf, Lior and Cevikalp, Hakan},
  booktitle = {Proceedings of the 15th Asian Conference on Machine Learning},
  pages     = {15--27},
  year      = {2024},
  editor    = {Yan{\i}ko{\u{g}}lu, Berrin and Buntine, Wray},
  volume    = {222},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v222/ali24a/ali24a.pdf},
  url       = {https://proceedings.mlr.press/v222/ali24a.html},
  abstract  = {Despite much research, Graph Neural Networks (GNNs) still do not display the favorable scaling properties of other deep neural networks such as Convolutional Neural Networks and Transformers. Previous work has identified issues such as oversmoothing of the latent representation and has suggested solutions such as skip connections and sophisticated normalization schemes. Here, we propose a different approach that is based on a stratification of the graph nodes. We provide motivation that the nodes in a graph can be stratified into those with a low degree and those with a high degree and that the two groups are likely to behave differently. Based on this motivation, we modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group. This simple-to-implement modification seems to improve performance across datasets and GNN methods. To verify that this increase in performance is not only due to the added capacity, we also perform the same modification for random splits of the nodes, which does not lead to any improvement.}
}
Endnote
%0 Conference Paper
%T Degree-based stratification of nodes in Graph Neural Networks
%A Ameen Ali
%A Lior Wolf
%A Hakan Cevikalp
%B Proceedings of the 15th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Berrin Yanıkoğlu
%E Wray Buntine
%F pmlr-v222-ali24a
%I PMLR
%P 15--27
%U https://proceedings.mlr.press/v222/ali24a.html
%V 222
%X Despite much research, Graph Neural Networks (GNNs) still do not display the favorable scaling properties of other deep neural networks such as Convolutional Neural Networks and Transformers. Previous work has identified issues such as oversmoothing of the latent representation and has suggested solutions such as skip connections and sophisticated normalization schemes. Here, we propose a different approach that is based on a stratification of the graph nodes. We provide motivation that the nodes in a graph can be stratified into those with a low degree and those with a high degree and that the two groups are likely to behave differently. Based on this motivation, we modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group. This simple-to-implement modification seems to improve performance across datasets and GNN methods. To verify that this increase in performance is not only due to the added capacity, we also perform the same modification for random splits of the nodes, which does not lead to any improvement.
APA
Ali, A., Wolf, L. & Cevikalp, H. (2024). Degree-based stratification of nodes in Graph Neural Networks. Proceedings of the 15th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 222:15-27. Available from https://proceedings.mlr.press/v222/ali24a.html.