Improving Breadth-Wise Backpropagation in Graph Neural Networks Helps Learning Long-Range Dependencies.

Denis Lukovnikov, Asja Fischer
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:7180-7191, 2021.

Abstract

In this work, we focus on the ability of graph neural networks (GNNs) to learn long-range patterns in graphs with edge features. Learning patterns that involve longer paths in the graph requires using deeper GNNs. However, GNNs suffer from a drop in performance with increasing network depth. To improve the performance of deeper GNNs, previous works have investigated normalization techniques and various types of skip connections. While they are designed to improve depth-wise backpropagation between the representations of the same node in successive layers, they do not improve breadth-wise backpropagation between representations of neighbouring nodes. To analyse the consequences, we design synthetic datasets serving as a testbed for the ability of GNNs to learn long-range patterns. Our analysis shows that several commonly used GNN variants with only depth-wise skip connections indeed have problems learning long-range patterns. They are clearly outperformed by an attention-based GNN architecture that we propose for improving both depth- and breadth-wise backpropagation. We also verify that the presented architecture is competitive on real-world data.
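
To make the distinction concrete, the sketch below is a minimal, generic message-passing layer with a depth-wise residual connection, written in PyTorch. It is not the architecture proposed in the paper; the class name ResidualMPLayer and all parameters are invented for illustration. The skip connection acts only between representations of the same node in successive layers, so gradients flowing between distant nodes must still pass through every neighbour-aggregation step, which is the breadth-wise bottleneck the abstract refers to.

import torch
import torch.nn as nn

class ResidualMPLayer(nn.Module):
    """One message-passing layer with a depth-wise (per-node) residual connection."""
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # h:   (num_nodes, dim) node representations
        # adj: (num_nodes, num_nodes) row-normalized adjacency matrix
        msg = adj @ self.linear(h)        # aggregate transformed neighbour states
        return h + torch.relu(msg)        # depth-wise skip: h^{l+1} = h^l + f(h^l, neighbours)

# Toy usage: a 4-node path graph; 3 layers let information travel 3 hops.
n, d = 4, 8
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=torch.float)
adj = adj / adj.sum(dim=1, keepdim=True)  # simple row normalization
h = torch.randn(n, d)
layers = nn.ModuleList([ResidualMPLayer(d) for _ in range(3)])
for layer in layers:
    h = layer(h, adj)
print(h.shape)  # torch.Size([4, 8])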

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-lukovnikov21a,
  title     = {Improving Breadth-Wise Backpropagation in Graph Neural Networks Helps Learning Long-Range Dependencies.},
  author    = {Lukovnikov, Denis and Fischer, Asja},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {7180--7191},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/lukovnikov21a/lukovnikov21a.pdf},
  url       = {https://proceedings.mlr.press/v139/lukovnikov21a.html},
  abstract  = {In this work, we focus on the ability of graph neural networks (GNNs) to learn long-range patterns in graphs with edge features. Learning patterns that involve longer paths in the graph, requires using deeper GNNs. However, GNNs suffer from a drop in performance with increasing network depth. To improve the performance of deeper GNNs, previous works have investigated normalization techniques and various types of skip connections. While they are designed to improve depth-wise backpropagation between the representations of the same node in successive layers, they do not improve breadth-wise backpropagation between representations of neighbouring nodes. To analyse the consequences, we design synthetic datasets serving as a testbed for the ability of GNNs to learn long-range patterns. Our analysis shows that several commonly used GNN variants with only depth-wise skip connections indeed have problems learning long-range patterns. They are clearly outperformed by an attention-based GNN architecture that we propose for improving both depth- and breadth-wise backpropagation. We also verify that the presented architecture is competitive on real-world data.}
}
Endnote
%0 Conference Paper
%T Improving Breadth-Wise Backpropagation in Graph Neural Networks Helps Learning Long-Range Dependencies.
%A Denis Lukovnikov
%A Asja Fischer
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-lukovnikov21a
%I PMLR
%P 7180--7191
%U https://proceedings.mlr.press/v139/lukovnikov21a.html
%V 139
%X In this work, we focus on the ability of graph neural networks (GNNs) to learn long-range patterns in graphs with edge features. Learning patterns that involve longer paths in the graph, requires using deeper GNNs. However, GNNs suffer from a drop in performance with increasing network depth. To improve the performance of deeper GNNs, previous works have investigated normalization techniques and various types of skip connections. While they are designed to improve depth-wise backpropagation between the representations of the same node in successive layers, they do not improve breadth-wise backpropagation between representations of neighbouring nodes. To analyse the consequences, we design synthetic datasets serving as a testbed for the ability of GNNs to learn long-range patterns. Our analysis shows that several commonly used GNN variants with only depth-wise skip connections indeed have problems learning long-range patterns. They are clearly outperformed by an attention-based GNN architecture that we propose for improving both depth- and breadth-wise backpropagation. We also verify that the presented architecture is competitive on real-world data.
APA
Lukovnikov, D. & Fischer, A. (2021). Improving Breadth-Wise Backpropagation in Graph Neural Networks Helps Learning Long-Range Dependencies. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:7180-7191. Available from https://proceedings.mlr.press/v139/lukovnikov21a.html.
