Continuous Graph Neural Networks

Louis-Pascal Xhonneux, Meng Qu, Jian Tang
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:10432-10441, 2020.

Abstract

This paper builds on the connection between graph neural networks and traditional dynamical systems. We propose continuous graph neural networks (CGNN), which generalise existing graph neural networks with discrete dynamics in that those networks can be viewed as specific discretisation schemes. The key idea is how to characterise the continuous dynamics of node representations, i.e. the derivatives of node representations w.r.t. time. Inspired by existing diffusion-based methods on graphs (e.g. PageRank and epidemic models on social networks), we define the derivatives as a combination of the current node representations, the representations of neighbors, and the initial values of the nodes. We propose and analyse two possible dynamics on graphs, in which each dimension of the node representations (a.k.a. a feature channel) either changes independently or interacts with the others, both with theoretical justification. The proposed continuous graph neural networks are robust to over-smoothing and hence allow us to build deeper networks, which in turn are able to capture long-range dependencies between nodes. Experimental results on the task of node classification demonstrate the effectiveness of our proposed approach over competitive baselines.
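The dynamics described in the abstract can be sketched numerically. Below is an illustrative toy example (not the authors' implementation): a forward-Euler integration of a diffusion ODE of the form dH/dt = (A − I)H + H0, where A is a normalised adjacency matrix, H the node representations, and H0 the initial node features, matching the abstract's "combination of the current node representations, the representations of neighbors, and the initial values of the nodes." The graph, step size, and integration horizon are made up for illustration.

```python
import numpy as np

def euler_cgnn(A, H0, t_end=1.0, dt=0.01):
    """Integrate dH/dt = (A - I) H + H0 from H(0) = H0 with forward Euler.

    A  : (n, n) normalised adjacency matrix
    H0 : (n, d) initial node features
    """
    n = A.shape[0]
    H = H0.copy()
    for _ in range(int(t_end / dt)):
        # Neighbor aggregation (A @ H), decay of current state (-H),
        # and a constant source term from the initial features (H0).
        H = H + dt * ((A - np.eye(n)) @ H + H0)
    return H

# Toy 3-node path graph with self-loops, symmetrically normalised
# (GCN-style: D^{-1/2} (adj + I) D^{-1/2}).
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]]) + np.eye(3)
deg = adj.sum(axis=1)
A = adj / np.sqrt(np.outer(deg, deg))

H0 = np.array([[1., 0.],
               [0., 1.],
               [1., 1.]])
H = euler_cgnn(A, H0)
```

Because the eigenvalues of A − I are non-positive for this normalisation, the homogeneous part of the dynamics is stable, while the H0 source term keeps the representations anchored to the initial features rather than collapsing to a common value, which is one intuition behind the claimed robustness to over-smoothing.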

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-xhonneux20a,
  title     = {Continuous Graph Neural Networks},
  author    = {Xhonneux, Louis-Pascal and Qu, Meng and Tang, Jian},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {10432--10441},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/xhonneux20a/xhonneux20a.pdf},
  url       = {https://proceedings.mlr.press/v119/xhonneux20a.html},
  abstract  = {This paper builds on the connection between graph neural networks and traditional dynamical systems. We propose continuous graph neural networks (CGNN), which generalise existing graph neural networks with discrete dynamics in that those networks can be viewed as specific discretisation schemes. The key idea is how to characterise the continuous dynamics of node representations, i.e. the derivatives of node representations w.r.t. time. Inspired by existing diffusion-based methods on graphs (e.g. PageRank and epidemic models on social networks), we define the derivatives as a combination of the current node representations, the representations of neighbors, and the initial values of the nodes. We propose and analyse two possible dynamics on graphs, in which each dimension of the node representations (a.k.a. a feature channel) either changes independently or interacts with the others, both with theoretical justification. The proposed continuous graph neural networks are robust to over-smoothing and hence allow us to build deeper networks, which in turn are able to capture long-range dependencies between nodes. Experimental results on the task of node classification demonstrate the effectiveness of our proposed approach over competitive baselines.}
}
Endnote
%0 Conference Paper
%T Continuous Graph Neural Networks
%A Louis-Pascal Xhonneux
%A Meng Qu
%A Jian Tang
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-xhonneux20a
%I PMLR
%P 10432--10441
%U https://proceedings.mlr.press/v119/xhonneux20a.html
%V 119
%X This paper builds on the connection between graph neural networks and traditional dynamical systems. We propose continuous graph neural networks (CGNN), which generalise existing graph neural networks with discrete dynamics in that those networks can be viewed as specific discretisation schemes. The key idea is how to characterise the continuous dynamics of node representations, i.e. the derivatives of node representations w.r.t. time. Inspired by existing diffusion-based methods on graphs (e.g. PageRank and epidemic models on social networks), we define the derivatives as a combination of the current node representations, the representations of neighbors, and the initial values of the nodes. We propose and analyse two possible dynamics on graphs, in which each dimension of the node representations (a.k.a. a feature channel) either changes independently or interacts with the others, both with theoretical justification. The proposed continuous graph neural networks are robust to over-smoothing and hence allow us to build deeper networks, which in turn are able to capture long-range dependencies between nodes. Experimental results on the task of node classification demonstrate the effectiveness of our proposed approach over competitive baselines.
APA
Xhonneux, L., Qu, M. & Tang, J. (2020). Continuous Graph Neural Networks. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:10432-10441. Available from https://proceedings.mlr.press/v119/xhonneux20a.html.

Related Material