Geometric Random Walk Graph Neural Networks via Implicit Layers

Giannis Nikolentzos, Michalis Vazirgiannis
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:2035-2053, 2023.

Abstract

Graph neural networks have recently attracted a lot of attention and have been applied with great success to several important graph problems. The Random Walk Graph Neural Network model was recently proposed as a more intuitive alternative to the well-studied family of message passing neural networks. This model compares each input graph against a set of latent “hidden graphs” using a kernel that counts common random walks up to some length. In this paper, we propose a new architecture, called Geometric Random Walk Graph Neural Network (GRWNN), that generalizes the above model such that it can count common walks of infinite length in two graphs. The proposed model retains the transparency of Random Walk Graph Neural Networks since its first layer also consists of a number of trainable “hidden graphs” which are compared against the input graphs using the geometric random walk kernel. To compute the kernel, we employ a fixed-point iteration approach involving implicitly defined operations. Then, we capitalize on implicit differentiation to derive an efficient training scheme which requires only constant memory, regardless of the number of fixed-point iterations. Experiments on graph classification datasets demonstrate the effectiveness of the proposed approach in comparison with state-of-the-art methods.
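The fixed-point computation the abstract refers to can be illustrated with a small sketch. The geometric random walk kernel between two graphs equals the (discounted) count of common walks of all lengths, computable as a weighted sum over the inverse of (I − λA×), where A× is the adjacency matrix of the direct product graph. The sketch below solves this by fixed-point iteration instead of matrix inversion, as the paper's approach suggests; the function name, the choice of λ, the tolerance, and the example graphs are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def geometric_rw_kernel(A1, A2, lam=0.1, tol=1e-8, max_iter=100):
    """Geometric random walk kernel between two graphs (uniform start/stop weights).

    Equals 1^T (I - lam * A_x)^{-1} 1, where A_x is the adjacency matrix
    of the direct product graph. Rather than inverting, we iterate the
    fixed point  x <- 1 + lam * A_x x,  which converges whenever
    lam is smaller than 1 over the spectral radius of A_x.
    """
    # Adjacency of the direct product graph = Kronecker product.
    Ax = np.kron(A1, A2)
    ones = np.ones(Ax.shape[0])
    x = ones.copy()
    for _ in range(max_iter):
        x_new = ones + lam * Ax @ x
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return float(x.sum())

# Two tiny example graphs: a triangle and a path on 3 nodes.
triangle = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
print(geometric_rw_kernel(triangle, path, lam=0.1))
```

In the model described by the abstract, A1 would be an input graph and A2 one of the trainable "hidden graphs"; implicit differentiation through the converged fixed point is what allows gradients to be computed with constant memory, since no intermediate iterates need to be stored.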

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-nikolentzos23c, title = {Geometric Random Walk Graph Neural Networks via Implicit Layers}, author = {Nikolentzos, Giannis and Vazirgiannis, Michalis}, booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics}, pages = {2035--2053}, year = {2023}, editor = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem}, volume = {206}, series = {Proceedings of Machine Learning Research}, month = {25--27 Apr}, publisher = {PMLR}, pdf = {https://proceedings.mlr.press/v206/nikolentzos23c/nikolentzos23c.pdf}, url = {https://proceedings.mlr.press/v206/nikolentzos23c.html}, abstract = {Graph neural networks have recently attracted a lot of attention and have been applied with great success to several important graph problems. The Random Walk Graph Neural Network model was recently proposed as a more intuitive alternative to the well-studied family of message passing neural networks. This model compares each input graph against a set of latent “hidden graphs” using a kernel that counts common random walks up to some length. In this paper, we propose a new architecture, called Geometric Random Walk Graph Neural Network (GRWNN), that generalizes the above model such that it can count common walks of infinite length in two graphs. The proposed model retains the transparency of Random Walk Graph Neural Networks since its first layer also consists of a number of trainable “hidden graphs” which are compared against the input graphs using the geometric random walk kernel. To compute the kernel, we employ a fixed-point iteration approach involving implicitly defined operations. Then, we capitalize on implicit differentiation to derive an efficient training scheme which requires only constant memory, regardless of the number of fixed-point iterations. Experiments on graph classification datasets demonstrate the effectiveness of the proposed approach in comparison with state-of-the-art methods.} }
Endnote
%0 Conference Paper %T Geometric Random Walk Graph Neural Networks via Implicit Layers %A Giannis Nikolentzos %A Michalis Vazirgiannis %B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2023 %E Francisco Ruiz %E Jennifer Dy %E Jan-Willem van de Meent %F pmlr-v206-nikolentzos23c %I PMLR %P 2035--2053 %U https://proceedings.mlr.press/v206/nikolentzos23c.html %V 206 %X Graph neural networks have recently attracted a lot of attention and have been applied with great success to several important graph problems. The Random Walk Graph Neural Network model was recently proposed as a more intuitive alternative to the well-studied family of message passing neural networks. This model compares each input graph against a set of latent “hidden graphs” using a kernel that counts common random walks up to some length. In this paper, we propose a new architecture, called Geometric Random Walk Graph Neural Network (GRWNN), that generalizes the above model such that it can count common walks of infinite length in two graphs. The proposed model retains the transparency of Random Walk Graph Neural Networks since its first layer also consists of a number of trainable “hidden graphs” which are compared against the input graphs using the geometric random walk kernel. To compute the kernel, we employ a fixed-point iteration approach involving implicitly defined operations. Then, we capitalize on implicit differentiation to derive an efficient training scheme which requires only constant memory, regardless of the number of fixed-point iterations. Experiments on graph classification datasets demonstrate the effectiveness of the proposed approach in comparison with state-of-the-art methods.
APA
Nikolentzos, G. & Vazirgiannis, M. (2023). Geometric Random Walk Graph Neural Networks via Implicit Layers. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:2035-2053. Available from https://proceedings.mlr.press/v206/nikolentzos23c.html.