Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks

Andreas Roth, Thomas Liebig
Proceedings of the Second Learning on Graphs Conference, PMLR 231:35:1-35:23, 2024.

Abstract

Our study reveals new theoretical insights into over-smoothing and feature over-correlation in graph neural networks. Specifically, we demonstrate that with increased depth, node representations become dominated by a low-dimensional subspace that depends on the aggregation function but not on the feature transformations. For all aggregation functions, the rank of the node representations collapses, resulting in over-smoothing for particular aggregation functions. Our study emphasizes the importance for future research to focus on rank collapse rather than over-smoothing. Guided by our theory, we propose a sum of Kronecker products as a beneficial property that provably prevents over-smoothing, over-correlation, and rank collapse. We empirically demonstrate the shortcomings of existing models in fitting target functions of node classification tasks.
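A minimal numerical sketch (not the authors' code) of the phenomenon the abstract describes: a linear GCN-style layer X' = A X W is, in vectorized form, a single Kronecker product (Wᵀ ⊗ A) vec(X), and iterating it drives the representations toward a low-rank subspace regardless of the weights. The proposed remedy is a sum of Kronecker products, X' = Σₖ Aₖ X Wₖ. Below, a cycle graph with orthogonal (norm-preserving) weights illustrates the standard update collapsing to rank 1, while an assumed two-term sum with an identity-aggregation second term does not; graph, depth, and weight choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, depth = 8, 4, 80  # nodes, feature dim, layers (illustrative choices)

# Symmetric-normalized adjacency of a cycle graph with self-loops
# (every node has degree 3, so normalization is a division by 3).
A = np.eye(n)
for i in range(n):
    A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1.0
A /= 3.0

def num_rank(X, rtol=1e-4):
    """Numerical rank: singular values above rtol times the largest."""
    s = np.linalg.svd(X, compute_uv=False)
    return int((s > rtol * s[0]).sum())

def orth(d):
    """Random orthogonal weight matrix (norm-preserving transformation)."""
    return np.linalg.qr(rng.standard_normal((d, d)))[0]

# Standard propagation X' = A X W: the rank collapses with depth,
# even though each W is orthogonal (feature transforms can't help).
X = rng.standard_normal((n, d))
for _ in range(depth):
    X = A @ X @ orth(d)
    X /= np.linalg.norm(X)  # rescale to avoid numerical drift

# Sum of two Kronecker products, X' = A X W1 + X W2 -- the second
# term uses identity aggregation, a residual-style counterexample.
Y = rng.standard_normal((n, d))
for _ in range(depth):
    Y = A @ Y @ orth(d) + Y @ orth(d)
    Y /= np.linalg.norm(Y)

print("single product rank:", num_rank(X))
print("sum of products rank:", num_rank(Y))
```

With the single Kronecker product, the representations align with the dominant eigenvector of A (constant over nodes here), which is exactly the over-smoothing regime; the sum keeps the rank from collapsing, consistent with the property the paper advocates.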

Cite this Paper


BibTeX
@InProceedings{pmlr-v231-roth24a,
  title     = {Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks},
  author    = {Roth, Andreas and Liebig, Thomas},
  booktitle = {Proceedings of the Second Learning on Graphs Conference},
  pages     = {35:1--35:23},
  year      = {2024},
  editor    = {Villar, Soledad and Chamberlain, Benjamin},
  volume    = {231},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v231/roth24a/roth24a.pdf},
  url       = {https://proceedings.mlr.press/v231/roth24a.html},
  abstract  = {Our study reveals new theoretical insights into over-smoothing and feature over-correlation in graph neural networks. Specifically, we demonstrate that with increased depth, node representations become dominated by a low-dimensional subspace that depends on the aggregation function but not on the feature transformations. For all aggregation functions, the rank of the node representations collapses, resulting in over-smoothing for particular aggregation functions. Our study emphasizes the importance for future research to focus on rank collapse rather than over-smoothing. Guided by our theory, we propose a sum of Kronecker products as a beneficial property that provably prevents over-smoothing, over-correlation, and rank collapse. We empirically demonstrate the shortcomings of existing models in fitting target functions of node classification tasks.}
}
Endnote
%0 Conference Paper
%T Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks
%A Andreas Roth
%A Thomas Liebig
%B Proceedings of the Second Learning on Graphs Conference
%C Proceedings of Machine Learning Research
%D 2024
%E Soledad Villar
%E Benjamin Chamberlain
%F pmlr-v231-roth24a
%I PMLR
%P 35:1--35:23
%U https://proceedings.mlr.press/v231/roth24a.html
%V 231
%X Our study reveals new theoretical insights into over-smoothing and feature over-correlation in graph neural networks. Specifically, we demonstrate that with increased depth, node representations become dominated by a low-dimensional subspace that depends on the aggregation function but not on the feature transformations. For all aggregation functions, the rank of the node representations collapses, resulting in over-smoothing for particular aggregation functions. Our study emphasizes the importance for future research to focus on rank collapse rather than over-smoothing. Guided by our theory, we propose a sum of Kronecker products as a beneficial property that provably prevents over-smoothing, over-correlation, and rank collapse. We empirically demonstrate the shortcomings of existing models in fitting target functions of node classification tasks.
APA
Roth, A. & Liebig, T. (2024). Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks. Proceedings of the Second Learning on Graphs Conference, in Proceedings of Machine Learning Research 231:35:1-35:23. Available from https://proceedings.mlr.press/v231/roth24a.html.